Jan 27 04:45:58.620950 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Jan 27 04:45:58.620974 kernel: Linux version 6.12.66-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Tue Jan 27 03:12:56 -00 2026 Jan 27 04:45:58.620985 kernel: KASLR enabled Jan 27 04:45:58.620991 kernel: efi: EFI v2.7 by EDK II Jan 27 04:45:58.620996 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438357218 Jan 27 04:45:58.621002 kernel: random: crng init done Jan 27 04:45:58.621009 kernel: secureboot: Secure boot disabled Jan 27 04:45:58.621015 kernel: ACPI: Early table checksum verification disabled Jan 27 04:45:58.621021 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS ) Jan 27 04:45:58.621029 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013) Jan 27 04:45:58.621036 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 04:45:58.621042 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 04:45:58.621049 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 04:45:58.621055 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 04:45:58.621064 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 04:45:58.621070 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 04:45:58.621077 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 04:45:58.621084 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 04:45:58.621106 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 04:45:58.621113 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 27 04:45:58.621119 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013) Jan 27 04:45:58.621132 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600 Jan 27 04:45:58.621139 kernel: ACPI: Use ACPI SPCR as default console: Yes Jan 27 04:45:58.621147 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff] Jan 27 04:45:58.621154 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff] Jan 27 04:45:58.621160 kernel: Zone ranges: Jan 27 04:45:58.621167 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Jan 27 04:45:58.621173 kernel: DMA32 empty Jan 27 04:45:58.621180 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff] Jan 27 04:45:58.621186 kernel: Device empty Jan 27 04:45:58.621193 kernel: Movable zone start for each node Jan 27 04:45:58.621199 kernel: Early memory node ranges Jan 27 04:45:58.621205 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff] Jan 27 04:45:58.621212 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff] Jan 27 04:45:58.621218 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff] Jan 27 04:45:58.621225 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff] Jan 27 04:45:58.621232 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff] Jan 27 04:45:58.621238 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff] Jan 27 04:45:58.621245 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1 Jan 27 04:45:58.621251 kernel: psci: probing for conduit method from ACPI. 
Jan 27 04:45:58.621261 kernel: psci: PSCIv1.3 detected in firmware. Jan 27 04:45:58.621269 kernel: psci: Using standard PSCI v0.2 function IDs Jan 27 04:45:58.621276 kernel: psci: Trusted OS migration not required Jan 27 04:45:58.621282 kernel: psci: SMC Calling Convention v1.1 Jan 27 04:45:58.621289 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Jan 27 04:45:58.621296 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0 Jan 27 04:45:58.621303 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0 Jan 27 04:45:58.621310 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0 Jan 27 04:45:58.621317 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0 Jan 27 04:45:58.621325 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Jan 27 04:45:58.621331 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Jan 27 04:45:58.621338 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3 Jan 27 04:45:58.621345 kernel: Detected PIPT I-cache on CPU0 Jan 27 04:45:58.621352 kernel: CPU features: detected: GIC system register CPU interface Jan 27 04:45:58.621359 kernel: CPU features: detected: Spectre-v4 Jan 27 04:45:58.621365 kernel: CPU features: detected: Spectre-BHB Jan 27 04:45:58.621372 kernel: CPU features: kernel page table isolation forced ON by KASLR Jan 27 04:45:58.621379 kernel: CPU features: detected: Kernel page table isolation (KPTI) Jan 27 04:45:58.621386 kernel: CPU features: detected: ARM erratum 1418040 Jan 27 04:45:58.621392 kernel: CPU features: detected: SSBS not fully self-synchronizing Jan 27 04:45:58.621401 kernel: alternatives: applying boot alternatives Jan 27 04:45:58.621409 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=e8d94e976e545d7a75a81392e4f736b09fc9f1bd0b9dfe995e69ba02d19f509a Jan 27 04:45:58.621416 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear) Jan 27 04:45:58.621423 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear) Jan 27 04:45:58.621429 kernel: Fallback order for Node 0: 0 Jan 27 04:45:58.621436 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304 Jan 27 04:45:58.621443 kernel: Policy zone: Normal Jan 27 04:45:58.621450 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 27 04:45:58.621457 kernel: software IO TLB: area num 4. Jan 27 04:45:58.621464 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB) Jan 27 04:45:58.621472 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Jan 27 04:45:58.621479 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 27 04:45:58.621486 kernel: rcu: RCU event tracing is enabled. Jan 27 04:45:58.621493 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Jan 27 04:45:58.621500 kernel: Trampoline variant of Tasks RCU enabled. Jan 27 04:45:58.621507 kernel: Tracing variant of Tasks RCU enabled. Jan 27 04:45:58.621514 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jan 27 04:45:58.621521 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Jan 27 04:45:58.621528 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. 
Jan 27 04:45:58.621535 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Jan 27 04:45:58.621542 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Jan 27 04:45:58.621549 kernel: GICv3: 256 SPIs implemented Jan 27 04:45:58.621556 kernel: GICv3: 0 Extended SPIs implemented Jan 27 04:45:58.621563 kernel: Root IRQ handler: gic_handle_irq Jan 27 04:45:58.621570 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Jan 27 04:45:58.621576 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Jan 27 04:45:58.621583 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Jan 27 04:45:58.621590 kernel: ITS [mem 0x08080000-0x0809ffff] Jan 27 04:45:58.621596 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1) Jan 27 04:45:58.621604 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1) Jan 27 04:45:58.621611 kernel: GICv3: using LPI property table @0x0000000100130000 Jan 27 04:45:58.621617 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000 Jan 27 04:45:58.621624 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Jan 27 04:45:58.621633 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 27 04:45:58.621640 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Jan 27 04:45:58.621646 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Jan 27 04:45:58.621653 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Jan 27 04:45:58.621660 kernel: arm-pv: using stolen time PV Jan 27 04:45:58.621668 kernel: Console: colour dummy device 80x25 Jan 27 04:45:58.621675 kernel: ACPI: Core revision 20240827 Jan 27 04:45:58.621682 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Jan 27 04:45:58.621691 kernel: pid_max: default: 32768 minimum: 301 Jan 27 04:45:58.621698 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jan 27 04:45:58.621705 kernel: landlock: Up and running. Jan 27 04:45:58.621712 kernel: SELinux: Initializing. Jan 27 04:45:58.621719 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 27 04:45:58.621727 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 27 04:45:58.621734 kernel: rcu: Hierarchical SRCU implementation. Jan 27 04:45:58.621741 kernel: rcu: Max phase no-delay instances is 400. Jan 27 04:45:58.621750 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jan 27 04:45:58.621757 kernel: Remapping and enabling EFI services. Jan 27 04:45:58.621764 kernel: smp: Bringing up secondary CPUs ... 
Jan 27 04:45:58.621771 kernel: Detected PIPT I-cache on CPU1 Jan 27 04:45:58.621779 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Jan 27 04:45:58.621786 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000 Jan 27 04:45:58.621793 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 27 04:45:58.621802 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Jan 27 04:45:58.621809 kernel: Detected PIPT I-cache on CPU2 Jan 27 04:45:58.621821 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000 Jan 27 04:45:58.621830 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000 Jan 27 04:45:58.621838 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 27 04:45:58.621845 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1] Jan 27 04:45:58.621852 kernel: Detected PIPT I-cache on CPU3 Jan 27 04:45:58.621860 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000 Jan 27 04:45:58.621869 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000 Jan 27 04:45:58.621876 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 27 04:45:58.621884 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1] Jan 27 04:45:58.621891 kernel: smp: Brought up 1 node, 4 CPUs Jan 27 04:45:58.621899 kernel: SMP: Total of 4 processors activated. Jan 27 04:45:58.621906 kernel: CPU: All CPU(s) started at EL1 Jan 27 04:45:58.621915 kernel: CPU features: detected: 32-bit EL0 Support Jan 27 04:45:58.621923 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Jan 27 04:45:58.621930 kernel: CPU features: detected: Common not Private translations Jan 27 04:45:58.621938 kernel: CPU features: detected: CRC32 instructions Jan 27 04:45:58.621945 kernel: CPU features: detected: Enhanced Virtualization Traps Jan 27 04:45:58.621953 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Jan 27 04:45:58.621961 kernel: CPU features: detected: LSE atomic instructions Jan 27 04:45:58.621969 kernel: CPU features: detected: Privileged Access Never Jan 27 04:45:58.621977 kernel: CPU features: detected: RAS Extension Support Jan 27 04:45:58.621984 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Jan 27 04:45:58.621992 kernel: alternatives: applying system-wide alternatives Jan 27 04:45:58.622000 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3 Jan 27 04:45:58.622008 kernel: Memory: 16324368K/16777216K available (11200K kernel code, 2458K rwdata, 9092K rodata, 12480K init, 1038K bss, 430064K reserved, 16384K cma-reserved) Jan 27 04:45:58.622015 kernel: devtmpfs: initialized Jan 27 04:45:58.622024 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 27 04:45:58.622032 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Jan 27 04:45:58.622039 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Jan 27 04:45:58.622047 kernel: 0 pages in range for non-PLT usage Jan 27 04:45:58.622054 kernel: 515152 pages in range for PLT usage Jan 27 04:45:58.622062 kernel: pinctrl core: initialized pinctrl subsystem Jan 27 04:45:58.622069 kernel: SMBIOS 3.0.0 present. 
Jan 27 04:45:58.622077 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015 Jan 27 04:45:58.622086 kernel: DMI: Memory slots populated: 1/1 Jan 27 04:45:58.622100 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 27 04:45:58.622108 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations Jan 27 04:45:58.622115 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Jan 27 04:45:58.622123 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Jan 27 04:45:58.622131 kernel: audit: initializing netlink subsys (disabled) Jan 27 04:45:58.622138 kernel: audit: type=2000 audit(0.037:1): state=initialized audit_enabled=0 res=1 Jan 27 04:45:58.622147 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 27 04:45:58.622155 kernel: cpuidle: using governor menu Jan 27 04:45:58.622163 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Jan 27 04:45:58.622170 kernel: ASID allocator initialised with 32768 entries Jan 27 04:45:58.622178 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 27 04:45:58.622185 kernel: Serial: AMBA PL011 UART driver Jan 27 04:45:58.622193 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 27 04:45:58.622202 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Jan 27 04:45:58.622209 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Jan 27 04:45:58.622217 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Jan 27 04:45:58.622224 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 27 04:45:58.622232 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Jan 27 04:45:58.622240 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Jan 27 04:45:58.622258 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Jan 27 04:45:58.622269 kernel: ACPI: Added _OSI(Module Device) Jan 27 04:45:58.622277 kernel: ACPI: Added _OSI(Processor Device) Jan 27 04:45:58.622284 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 27 04:45:58.622291 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 27 04:45:58.622299 kernel: ACPI: Interpreter enabled Jan 27 04:45:58.622306 kernel: ACPI: Using GIC for interrupt routing Jan 27 04:45:58.622314 kernel: ACPI: MCFG table detected, 1 entries Jan 27 04:45:58.622321 kernel: ACPI: CPU0 has been hot-added Jan 27 04:45:58.622330 kernel: ACPI: CPU1 has been hot-added Jan 27 04:45:58.622337 kernel: ACPI: CPU2 has been hot-added Jan 27 04:45:58.622345 kernel: ACPI: CPU3 has been hot-added Jan 27 04:45:58.622352 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Jan 27 04:45:58.622360 kernel: printk: legacy console [ttyAMA0] enabled Jan 27 04:45:58.622367 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jan 27 04:45:58.622528 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 27 04:45:58.622617 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Jan 27 04:45:58.622697 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Jan 27 04:45:58.622776 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Jan 27 04:45:58.622856 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Jan 27 04:45:58.622865 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Jan 27 04:45:58.622873 
kernel: PCI host bridge to bus 0000:00 Jan 27 04:45:58.622964 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Jan 27 04:45:58.623040 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Jan 27 04:45:58.623148 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Jan 27 04:45:58.623245 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jan 27 04:45:58.623348 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Jan 27 04:45:58.623440 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.623523 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff] Jan 27 04:45:58.623622 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 27 04:45:58.623703 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff] Jan 27 04:45:58.623781 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref] Jan 27 04:45:58.623868 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.623949 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff] Jan 27 04:45:58.624029 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Jan 27 04:45:58.624131 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff] Jan 27 04:45:58.624224 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.624303 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff] Jan 27 04:45:58.624384 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Jan 27 04:45:58.624463 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff] Jan 27 04:45:58.624542 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref] Jan 27 04:45:58.624629 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.624708 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff] Jan 27 04:45:58.624786 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Jan 27 04:45:58.624866 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref] Jan 27 04:45:58.624951 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.625031 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff] Jan 27 04:45:58.625132 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Jan 27 04:45:58.625223 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff] Jan 27 04:45:58.625346 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref] Jan 27 04:45:58.625443 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.625524 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff] Jan 27 04:45:58.625623 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Jan 27 04:45:58.625702 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff] Jan 27 04:45:58.625782 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref] Jan 27 04:45:58.625886 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.625968 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff] Jan 27 04:45:58.626045 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Jan 27 04:45:58.626150 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.626231 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff] Jan 27 04:45:58.626322 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Jan 27 
04:45:58.626407 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.626497 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff] Jan 27 04:45:58.626575 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Jan 27 04:45:58.626661 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.626740 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff] Jan 27 04:45:58.626820 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Jan 27 04:45:58.626905 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.626984 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff] Jan 27 04:45:58.627061 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Jan 27 04:45:58.627159 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.627240 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff] Jan 27 04:45:58.627324 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Jan 27 04:45:58.627412 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.627492 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff] Jan 27 04:45:58.627569 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Jan 27 04:45:58.627656 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.627744 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff] Jan 27 04:45:58.627823 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Jan 27 04:45:58.627908 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.627986 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff] Jan 27 04:45:58.628063 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Jan 27 04:45:58.628203 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.628294 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff] Jan 27 04:45:58.628373 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Jan 27 04:45:58.628458 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.628537 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff] Jan 27 04:45:58.628616 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Jan 27 04:45:58.628702 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.628785 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff] Jan 27 04:45:58.628865 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Jan 27 04:45:58.628945 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff] Jan 27 04:45:58.629023 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff] Jan 27 04:45:58.629124 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.629207 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff] Jan 27 04:45:58.629291 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Jan 27 04:45:58.629368 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff] Jan 27 04:45:58.629445 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff] Jan 27 04:45:58.629531 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.629609 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff] Jan 27 04:45:58.629685 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Jan 27 04:45:58.629763 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff] Jan 27 04:45:58.629839 kernel: pci 0000:00:03.3: bridge window [mem 
0x11a00000-0x11bfffff] Jan 27 04:45:58.629922 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.630000 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff] Jan 27 04:45:58.630077 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Jan 27 04:45:58.630164 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff] Jan 27 04:45:58.630255 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff] Jan 27 04:45:58.630353 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.630434 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff] Jan 27 04:45:58.630513 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Jan 27 04:45:58.630592 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff] Jan 27 04:45:58.630670 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff] Jan 27 04:45:58.630759 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.630839 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff] Jan 27 04:45:58.630916 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Jan 27 04:45:58.630994 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff] Jan 27 04:45:58.631077 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff] Jan 27 04:45:58.631190 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.631279 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff] Jan 27 04:45:58.631359 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Jan 27 04:45:58.631441 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff] Jan 27 04:45:58.631546 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff] Jan 27 04:45:58.631638 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.631719 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff] Jan 27 04:45:58.631798 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Jan 27 04:45:58.631875 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff] Jan 27 04:45:58.631951 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff] Jan 27 04:45:58.632035 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.632130 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff] Jan 27 04:45:58.632213 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Jan 27 04:45:58.632290 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff] Jan 27 04:45:58.632367 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff] Jan 27 04:45:58.632458 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.632539 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff] Jan 27 04:45:58.632619 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Jan 27 04:45:58.632698 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff] Jan 27 04:45:58.632775 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff] Jan 27 04:45:58.632858 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.632936 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff] Jan 27 04:45:58.633013 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Jan 27 04:45:58.633127 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff] Jan 27 04:45:58.633207 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff] Jan 27 04:45:58.633295 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 
04:45:58.633374 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff] Jan 27 04:45:58.633450 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Jan 27 04:45:58.633530 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff] Jan 27 04:45:58.633607 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff] Jan 27 04:45:58.633691 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.633769 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff] Jan 27 04:45:58.633845 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Jan 27 04:45:58.633921 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff] Jan 27 04:45:58.634001 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff] Jan 27 04:45:58.634085 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.634177 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff] Jan 27 04:45:58.634265 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Jan 27 04:45:58.634347 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff] Jan 27 04:45:58.634425 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff] Jan 27 04:45:58.634513 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.634592 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff] Jan 27 04:45:58.634668 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Jan 27 04:45:58.634746 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff] Jan 27 04:45:58.634823 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff] Jan 27 04:45:58.634909 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 27 04:45:58.634991 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff] Jan 27 04:45:58.635068 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Jan 27 04:45:58.635160 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff] Jan 27 04:45:58.635240 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff] Jan 27 04:45:58.635333 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 27 04:45:58.635417 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff] Jan 27 04:45:58.635497 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Jan 27 04:45:58.635575 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 27 04:45:58.635660 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Jan 27 04:45:58.635740 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit] Jan 27 04:45:58.635826 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Jan 27 04:45:58.635908 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff] Jan 27 04:45:58.635988 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref] Jan 27 04:45:58.636075 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 27 04:45:58.636170 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref] Jan 27 04:45:58.636261 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 27 04:45:58.636351 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff] Jan 27 04:45:58.636432 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref] Jan 27 04:45:58.636523 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint Jan 27 04:45:58.636603 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff] Jan 27 04:45:58.636683 kernel: pci 
0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref] Jan 27 04:45:58.636765 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Jan 27 04:45:58.636847 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 Jan 27 04:45:58.636924 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Jan 27 04:45:58.637006 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Jan 27 04:45:58.637084 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Jan 27 04:45:58.637180 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Jan 27 04:45:58.637264 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jan 27 04:45:58.637344 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Jan 27 04:45:58.637424 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Jan 27 04:45:58.637548 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jan 27 04:45:58.637636 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Jan 27 04:45:58.637716 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Jan 27 04:45:58.637798 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Jan 27 04:45:58.637877 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Jan 27 04:45:58.637955 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 Jan 27 04:45:58.638035 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 27 04:45:58.638128 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Jan 27 04:45:58.638227 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Jan 27 04:45:58.638322 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 27 04:45:58.638402 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000 Jan 27 04:45:58.638483 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000 Jan 27 04:45:58.638566 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 27 04:45:58.638646 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Jan 27 04:45:58.638724 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Jan 27 04:45:58.638805 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 27 04:45:58.638883 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Jan 27 04:45:58.638960 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] 
add_size 200000 add_align 100000 Jan 27 04:45:58.639063 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 27 04:45:58.639160 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000 Jan 27 04:45:58.639241 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000 Jan 27 04:45:58.639324 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Jan 27 04:45:58.639424 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Jan 27 04:45:58.639502 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000 Jan 27 04:45:58.639588 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Jan 27 04:45:58.639665 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000 Jan 27 04:45:58.639742 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000 Jan 27 04:45:58.639824 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Jan 27 04:45:58.639901 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000 Jan 27 04:45:58.639993 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000 Jan 27 04:45:58.640084 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 27 04:45:58.640173 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Jan 27 04:45:58.640253 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Jan 27 04:45:58.640340 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 27 04:45:58.640419 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Jan 27 04:45:58.640501 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Jan 27 04:45:58.640583 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 27 04:45:58.640662 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Jan 27 04:45:58.640741 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Jan 27 04:45:58.640824 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 27 04:45:58.640906 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Jan 27 04:45:58.640985 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Jan 27 04:45:58.641066 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 27 04:45:58.641162 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Jan 27 04:45:58.641241 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 
Jan 27 04:45:58.641322 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 27 04:45:58.641403 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Jan 27 04:45:58.641481 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Jan 27 04:45:58.641563 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 27 04:45:58.641642 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Jan 27 04:45:58.641721 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Jan 27 04:45:58.641802 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 27 04:45:58.641887 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Jan 27 04:45:58.641968 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Jan 27 04:45:58.642052 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 27 04:45:58.642141 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Jan 27 04:45:58.642220 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Jan 27 04:45:58.642326 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 27 04:45:58.642407 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Jan 27 04:45:58.642486 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Jan 27 04:45:58.642570 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 27 04:45:58.642651 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Jan 27 04:45:58.642732 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Jan 27 04:45:58.642821 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 27 04:45:58.642905 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Jan 27 04:45:58.643005 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Jan 27 04:45:58.643097 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 27 04:45:58.643183 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Jan 27 04:45:58.643262 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Jan 27 04:45:58.643349 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 27 04:45:58.643428 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Jan 27 04:45:58.643507 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Jan 27 04:45:58.643588 kernel: pci 
0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 27 04:45:58.643670 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Jan 27 04:45:58.643753 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Jan 27 04:45:58.643838 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 27 04:45:58.643920 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Jan 27 04:45:58.644021 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Jan 27 04:45:58.644115 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 27 04:45:58.644199 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Jan 27 04:45:58.644281 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Jan 27 04:45:58.644364 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 27 04:45:58.644444 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000 Jan 27 04:45:58.644522 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Jan 27 04:45:58.644604 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 27 04:45:58.644684 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Jan 27 04:45:58.644765 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Jan 27 04:45:58.644849 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 27 04:45:58.644928 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Jan 27 04:45:58.645010 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Jan 27 04:45:58.645110 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Jan 27 04:45:58.645198 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned Jan 27 04:45:58.645283 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Jan 27 04:45:58.645363 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Jan 27 04:45:58.645444 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Jan 27 04:45:58.645529 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Jan 27 04:45:58.645610 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Jan 27 04:45:58.645696 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Jan 27 04:45:58.645803 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Jan 27 04:45:58.645882 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Jan 27 04:45:58.645963 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Jan 27 04:45:58.646043 kernel: pci 0000:00:01.5: bridge 
window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Jan 27 04:45:58.646137 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Jan 27 04:45:58.646217 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Jan 27 04:45:58.646316 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Jan 27 04:45:58.646400 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Jan 27 04:45:58.646482 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Jan 27 04:45:58.646564 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Jan 27 04:45:58.646648 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Jan 27 04:45:58.646727 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: assigned Jan 27 04:45:58.646813 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Jan 27 04:45:58.646900 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Jan 27 04:45:58.646982 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Jan 27 04:45:58.647063 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Jan 27 04:45:58.647194 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Jan 27 04:45:58.647281 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Jan 27 04:45:58.647365 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Jan 27 04:45:58.647531 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned Jan 27 04:45:58.647626 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Jan 27 04:45:58.647706 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Jan 27 04:45:58.647789 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Jan 27 04:45:58.647883 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Jan 27 04:45:58.647965 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Jan 27 04:45:58.648080 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Jan 27 04:45:58.648215 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Jan 27 04:45:58.648297 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Jan 27 04:45:58.648384 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Jan 27 04:45:58.648465 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Jan 27 04:45:58.648549 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Jan 27 04:45:58.648635 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Jan 27 04:45:58.648717 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Jan 27 04:45:58.648799 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Jan 27 04:45:58.648883 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Jan 27 04:45:58.648964 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Jan 27 04:45:58.649046 
kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Jan 27 04:45:58.649143 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Jan 27 04:45:58.649230 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Jan 27 04:45:58.649310 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Jan 27 04:45:58.649394 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Jan 27 04:45:58.649476 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Jan 27 04:45:58.649558 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Jan 27 04:45:58.649640 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Jan 27 04:45:58.649728 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Jan 27 04:45:58.649841 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Jan 27 04:45:58.649928 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Jan 27 04:45:58.650014 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Jan 27 04:45:58.650113 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff]: assigned Jan 27 04:45:58.650215 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Jan 27 04:45:58.650318 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Jan 27 04:45:58.650402 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Jan 27 04:45:58.650487 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Jan 27 04:45:58.650566 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Jan 27 04:45:58.650647 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Jan 27 04:45:58.650726 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Jan 27 04:45:58.650807 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned Jan 27 04:45:58.650891 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Jan 27 04:45:58.650974 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Jan 27 04:45:58.651056 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Jan 27 04:45:58.651159 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Jan 27 04:45:58.651247 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Jan 27 04:45:58.651329 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Jan 27 04:45:58.651411 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Jan 27 04:45:58.651492 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Jan 27 04:45:58.651572 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Jan 27 04:45:58.651658 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Jan 27 04:45:58.651736 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Jan 27 04:45:58.651819 kernel: pci 0000:00:01.5: BAR 0 [mem 0x14205000-0x14205fff]: assigned Jan 27 04:45:58.651900 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Jan 27 04:45:58.651980 kernel: pci 0000:00:01.6: BAR 0 [mem 
0x14206000-0x14206fff]: assigned Jan 27 04:45:58.652058 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Jan 27 04:45:58.652150 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Jan 27 04:45:58.652235 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Jan 27 04:45:58.652316 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Jan 27 04:45:58.652395 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Jan 27 04:45:58.652478 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Jan 27 04:45:58.652555 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned Jan 27 04:45:58.652634 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Jan 27 04:45:58.652712 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Jan 27 04:45:58.652792 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Jan 27 04:45:58.652870 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Jan 27 04:45:58.652952 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Jan 27 04:45:58.653030 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Jan 27 04:45:58.653119 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Jan 27 04:45:58.653198 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Jan 27 04:45:58.653279 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Jan 27 04:45:58.653360 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Jan 27 04:45:58.653441 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Jan 27 04:45:58.653536 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.653620 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.653702 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Jan 27 04:45:58.653782 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.653863 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.653944 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Jan 27 04:45:58.654023 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.654131 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.654215 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Jan 27 04:45:58.654312 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.654397 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.654481 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Jan 27 04:45:58.654562 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.654643 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.654726 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Jan 27 04:45:58.654808 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.654887 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.654971 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Jan 27 04:45:58.655052 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't 
assign; no space Jan 27 04:45:58.655159 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.655245 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Jan 27 04:45:58.655346 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.655425 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.655513 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Jan 27 04:45:58.655592 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.655670 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.655749 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Jan 27 04:45:58.655828 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.655905 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.655988 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Jan 27 04:45:58.656065 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.656158 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.656239 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Jan 27 04:45:58.656317 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.656404 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.656490 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Jan 27 04:45:58.656576 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.656654 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.656734 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Jan 27 04:45:58.656812 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.656890 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.656970 kernel: pci 0000:00:04.5: BAR 0 [mem 0x1421d000-0x1421dfff]: assigned Jan 27 04:45:58.657069 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.657168 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.657251 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Jan 27 04:45:58.657330 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.657408 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.657488 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Jan 27 04:45:58.657566 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.657647 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.657727 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned Jan 27 04:45:58.657804 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.657882 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.657960 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Jan 27 04:45:58.658038 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Jan 27 
04:45:58.658129 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Jan 27 04:45:58.658208 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Jan 27 04:45:58.658301 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Jan 27 04:45:58.658387 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Jan 27 04:45:58.658466 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Jan 27 04:45:58.658544 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Jan 27 04:45:58.658623 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Jan 27 04:45:58.658703 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff]: assigned Jan 27 04:45:58.658782 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Jan 27 04:45:58.658860 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Jan 27 04:45:58.658940 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Jan 27 04:45:58.659022 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Jan 27 04:45:58.659126 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Jan 27 04:45:58.659210 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.659293 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.659373 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.659452 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.659531 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.659608 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.659688 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.659766 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.659848 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.659926 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.660005 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.660082 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.660174 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.660253 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.660332 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.660413 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.660492 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.660569 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.660648 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.660727 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.660807 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.660888 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.660967 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.661045 kernel: pci 
0000:00:01.6: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.661135 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.661214 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.661295 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.661375 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.661454 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.661533 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.661613 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.661690 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.661772 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.661851 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.661930 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space Jan 27 04:45:58.662008 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Jan 27 04:45:58.662101 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Jan 27 04:45:58.662188 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Jan 27 04:45:58.662282 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Jan 27 04:45:58.662363 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 27 04:45:58.662442 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Jan 27 04:45:58.662520 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Jan 27 04:45:58.662605 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Jan 27 04:45:58.662682 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Jan 27 04:45:58.662763 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Jan 27 04:45:58.662840 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Jan 27 04:45:58.662924 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Jan 27 04:45:58.663005 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Jan 27 04:45:58.663083 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Jan 27 04:45:58.663184 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Jan 27 04:45:58.663266 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Jan 27 04:45:58.663349 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Jan 27 04:45:58.663427 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Jan 27 04:45:58.663505 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Jan 27 04:45:58.663583 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Jan 27 04:45:58.663670 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned Jan 27 04:45:58.663750 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Jan 27 04:45:58.663827 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Jan 27 04:45:58.663905 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Jan 27 04:45:58.663983 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Jan 27 04:45:58.664068 
kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Jan 27 04:45:58.664166 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Jan 27 04:45:58.664247 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Jan 27 04:45:58.664326 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Jan 27 04:45:58.664403 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 27 04:45:58.664482 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Jan 27 04:45:58.664559 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Jan 27 04:45:58.664638 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 27 04:45:58.664718 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Jan 27 04:45:58.664796 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Jan 27 04:45:58.664874 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 27 04:45:58.664952 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Jan 27 04:45:58.665030 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Jan 27 04:45:58.665119 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Jan 27 04:45:58.665199 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Jan 27 04:45:58.665277 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Jan 27 04:45:58.665356 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Jan 27 04:45:58.665434 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Jan 27 04:45:58.665512 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Jan 27 04:45:58.665592 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Jan 27 04:45:58.665670 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Jan 27 04:45:58.665749 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Jan 27 04:45:58.665827 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Jan 27 04:45:58.665905 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Jan 27 04:45:58.665985 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Jan 27 04:45:58.666063 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Jan 27 04:45:58.666153 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Jan 27 04:45:58.666233 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Jan 27 04:45:58.666326 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 27 04:45:58.666410 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Jan 27 04:45:58.666488 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Jan 27 04:45:58.666567 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 27 04:45:58.666647 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Jan 27 04:45:58.666726 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Jan 27 04:45:58.666804 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 27 04:45:58.666886 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Jan 27 04:45:58.666965 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Jan 27 04:45:58.667043 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Jan 27 04:45:58.667143 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Jan 27 04:45:58.667226 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Jan 27 04:45:58.667305 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Jan 27 04:45:58.667386 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Jan 27 04:45:58.667464 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Jan 27 04:45:58.667543 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Jan 27 04:45:58.667620 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Jan 27 04:45:58.667699 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Jan 27 04:45:58.667777 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Jan 27 04:45:58.667856 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Jan 27 04:45:58.667937 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Jan 27 04:45:58.668016 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Jan 27 04:45:58.668103 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Jan 27 04:45:58.668185 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Jan 27 04:45:58.668262 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Jan 27 04:45:58.668342 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Jan 27 04:45:58.668424 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Jan 27 04:45:58.668502 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Jan 27 04:45:58.668581 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 27 04:45:58.668661 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Jan 27 04:45:58.668740 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Jan 27 04:45:58.668818 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Jan 27 04:45:58.668895 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 27 04:45:58.668976 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Jan 27 04:45:58.669054 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff] Jan 27 04:45:58.669142 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Jan 27 04:45:58.669222 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 27 04:45:58.669302 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Jan 27 04:45:58.669380 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Jan 27 04:45:58.669461 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Jan 27 04:45:58.669539 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Jan 27 04:45:58.669621 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Jan 27 04:45:58.669700 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Jan 27 04:45:58.669778 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Jan 27 04:45:58.669856 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Jan 27 04:45:58.669935 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Jan 27 04:45:58.670017 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Jan 27 04:45:58.670103 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Jan 27 04:45:58.670183 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Jan 27 04:45:58.670275 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Jan 27 04:45:58.670358 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Jan 27 04:45:58.670436 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Jan 27 
04:45:58.670514 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Jan 27 04:45:58.670596 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Jan 27 04:45:58.670675 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Jan 27 04:45:58.670753 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Jan 27 04:45:58.670831 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] Jan 27 04:45:58.670911 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Jan 27 04:45:58.670989 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Jan 27 04:45:58.671069 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Jan 27 04:45:58.671175 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 27 04:45:58.671257 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Jan 27 04:45:58.671338 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Jan 27 04:45:58.671417 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Jan 27 04:45:58.671495 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 27 04:45:58.671576 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Jan 27 04:45:58.671659 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Jan 27 04:45:58.671737 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Jan 27 04:45:58.671814 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 27 04:45:58.671895 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Jan 27 04:45:58.671974 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Jan 27 04:45:58.672052 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Jan 27 04:45:58.672147 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Jan 27 04:45:58.672268 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jan 27 04:45:58.672347 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Jan 27 04:45:58.672419 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jan 27 04:45:58.672504 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Jan 27 04:45:58.672580 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Jan 27 04:45:58.672664 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Jan 27 04:45:58.672738 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Jan 27 04:45:58.672827 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Jan 27 04:45:58.672901 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Jan 27 04:45:58.672981 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Jan 27 04:45:58.673056 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Jan 27 04:45:58.673549 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Jan 27 04:45:58.673626 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Jan 27 04:45:58.673707 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Jan 27 04:45:58.673780 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 27 04:45:58.673858 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jan 27 04:45:58.673934 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 27 04:45:58.674014 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jan 27 
04:45:58.674105 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 27 04:45:58.674193 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jan 27 04:45:58.674280 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jan 27 04:45:58.674368 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Jan 27 04:45:58.674443 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Jan 27 04:45:58.674523 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Jan 27 04:45:58.674597 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Jan 27 04:45:58.674684 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Jan 27 04:45:58.674759 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Jan 27 04:45:58.674838 kernel: pci_bus 0000:0d: resource 1 [mem 0x11800000-0x119fffff] Jan 27 04:45:58.674912 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Jan 27 04:45:58.674993 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Jan 27 04:45:58.675066 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 27 04:45:58.675161 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Jan 27 04:45:58.675237 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 27 04:45:58.675322 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Jan 27 04:45:58.675395 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 27 04:45:58.675475 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Jan 27 04:45:58.675551 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Jan 27 04:45:58.675631 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Jan 27 04:45:58.675704 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Jan 27 04:45:58.675783 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Jan 27 04:45:58.675856 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Jan 27 04:45:58.675930 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Jan 27 04:45:58.676009 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Jan 27 04:45:58.676082 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Jan 27 04:45:58.676166 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Jan 27 04:45:58.676252 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Jan 27 04:45:58.676325 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Jan 27 04:45:58.676401 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Jan 27 04:45:58.676481 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Jan 27 04:45:58.676554 kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Jan 27 04:45:58.676626 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 27 04:45:58.676704 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Jan 27 04:45:58.676784 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Jan 27 04:45:58.676872 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 27 04:45:58.676954 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Jan 27 04:45:58.677027 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Jan 27 04:45:58.677116 kernel: pci_bus 0000:18: resource 2 [mem 
0x8002e00000-0x8002ffffff 64bit pref] Jan 27 04:45:58.677198 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Jan 27 04:45:58.677274 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Jan 27 04:45:58.677346 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Jan 27 04:45:58.677424 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Jan 27 04:45:58.677497 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Jan 27 04:45:58.677578 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Jan 27 04:45:58.677659 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 27 04:45:58.677735 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Jan 27 04:45:58.677808 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Jan 27 04:45:58.677896 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Jan 27 04:45:58.677971 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Jan 27 04:45:58.678044 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Jan 27 04:45:58.678151 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Jan 27 04:45:58.678228 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Jan 27 04:45:58.678316 kernel: pci_bus 0000:1d: resource 2 [mem 0x8003800000-0x80039fffff 64bit pref] Jan 27 04:45:58.678401 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Jan 27 04:45:58.678475 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Jan 27 04:45:58.678549 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 27 04:45:58.678633 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Jan 27 04:45:58.678709 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Jan 27 04:45:58.678784 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 27 04:45:58.678866 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Jan 27 04:45:58.678975 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Jan 27 04:45:58.679054 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 27 04:45:58.679165 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Jan 27 04:45:58.679248 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Jan 27 04:45:58.679322 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Jan 27 04:45:58.679332 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jan 27 04:45:58.679341 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jan 27 04:45:58.679349 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jan 27 04:45:58.679359 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jan 27 04:45:58.679368 kernel: iommu: Default domain type: Translated Jan 27 04:45:58.679376 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jan 27 04:45:58.679385 kernel: efivars: Registered efivars operations Jan 27 04:45:58.679392 kernel: vgaarb: loaded Jan 27 04:45:58.679400 kernel: clocksource: Switched to clocksource arch_sys_counter Jan 27 04:45:58.679408 kernel: VFS: Disk quotas dquot_6.6.0 Jan 27 04:45:58.679418 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 27 04:45:58.679426 kernel: pnp: PnP ACPI init Jan 27 04:45:58.679516 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jan 27 04:45:58.679528 kernel: pnp: PnP ACPI: found 1 devices Jan 27 04:45:58.679536 kernel: NET: Registered 
PF_INET protocol family Jan 27 04:45:58.679544 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 27 04:45:58.679554 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Jan 27 04:45:58.679562 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 27 04:45:58.679570 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 27 04:45:58.679578 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Jan 27 04:45:58.679586 kernel: TCP: Hash tables configured (established 131072 bind 65536) Jan 27 04:45:58.679595 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 27 04:45:58.679603 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 27 04:45:58.679612 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 27 04:45:58.679698 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jan 27 04:45:58.679710 kernel: PCI: CLS 0 bytes, default 64 Jan 27 04:45:58.679726 kernel: kvm [1]: HYP mode not available Jan 27 04:45:58.679735 kernel: Initialise system trusted keyrings Jan 27 04:45:58.679743 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Jan 27 04:45:58.679751 kernel: Key type asymmetric registered Jan 27 04:45:58.679760 kernel: Asymmetric key parser 'x509' registered Jan 27 04:45:58.679768 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jan 27 04:45:58.679776 kernel: io scheduler mq-deadline registered Jan 27 04:45:58.679785 kernel: io scheduler kyber registered Jan 27 04:45:58.679792 kernel: io scheduler bfq registered Jan 27 04:45:58.679801 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jan 27 04:45:58.679885 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Jan 27 04:45:58.679967 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Jan 27 04:45:58.680055 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.680166 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Jan 27 04:45:58.680256 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Jan 27 04:45:58.680336 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.680417 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Jan 27 04:45:58.680497 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Jan 27 04:45:58.680582 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.680662 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Jan 27 04:45:58.680760 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Jan 27 04:45:58.680839 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.680919 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Jan 27 04:45:58.681009 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Jan 27 04:45:58.681102 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.681195 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Jan 27 04:45:58.681276 kernel: pcieport 0000:00:01.5: AER: 
enabled with IRQ 55 Jan 27 04:45:58.681357 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.681443 kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Jan 27 04:45:58.681521 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Jan 27 04:45:58.681601 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.681691 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Jan 27 04:45:58.681773 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Jan 27 04:45:58.681851 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.681862 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jan 27 04:45:58.681941 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Jan 27 04:45:58.682021 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Jan 27 04:45:58.682115 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.682210 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Jan 27 04:45:58.682310 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Jan 27 04:45:58.682391 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.682472 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Jan 27 04:45:58.682550 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Jan 27 04:45:58.682631 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.682713 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Jan 27 04:45:58.682796 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Jan 27 04:45:58.682874 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.682955 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Jan 27 04:45:58.683034 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Jan 27 04:45:58.683128 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.683215 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Jan 27 04:45:58.683293 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Jan 27 04:45:58.683372 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.683452 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Jan 27 04:45:58.683530 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Jan 27 04:45:58.683608 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.683691 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Jan 27 04:45:58.683772 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Jan 27 04:45:58.683851 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- 
LLActRep+ Jan 27 04:45:58.683862 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jan 27 04:45:58.683940 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Jan 27 04:45:58.684019 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Jan 27 04:45:58.684121 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.684208 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Jan 27 04:45:58.684287 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Jan 27 04:45:58.684367 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.684447 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Jan 27 04:45:58.684525 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Jan 27 04:45:58.684603 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.684687 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Jan 27 04:45:58.684765 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Jan 27 04:45:58.684843 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.684922 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Jan 27 04:45:58.684999 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Jan 27 04:45:58.685078 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.685176 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Jan 27 04:45:58.685255 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Jan 27 04:45:58.685334 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.685414 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Jan 27 04:45:58.685492 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Jan 27 04:45:58.685570 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.685653 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Jan 27 04:45:58.685731 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Jan 27 04:45:58.685810 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.685821 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jan 27 04:45:58.685900 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Jan 27 04:45:58.685979 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Jan 27 04:45:58.686057 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.686162 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Jan 27 04:45:58.686243 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Jan 27 04:45:58.686338 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.686421 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Jan 27 
04:45:58.686502 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Jan 27 04:45:58.686581 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.686665 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Jan 27 04:45:58.686744 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Jan 27 04:45:58.686823 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.686905 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Jan 27 04:45:58.686985 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Jan 27 04:45:58.687064 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.687166 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Jan 27 04:45:58.687250 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Jan 27 04:45:58.687330 kernel: pcieport 0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.687410 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Jan 27 04:45:58.687489 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Jan 27 04:45:58.687567 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.687651 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Jan 27 04:45:58.687729 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Jan 27 04:45:58.687808 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.687890 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Jan 27 04:45:58.687971 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Jan 27 04:45:58.688051 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 27 04:45:58.688062 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jan 27 04:45:58.688072 kernel: ACPI: button: Power Button [PWRB] Jan 27 04:45:58.688182 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Jan 27 04:45:58.688270 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jan 27 04:45:58.688281 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 27 04:45:58.688290 kernel: thunder_xcv, ver 1.0 Jan 27 04:45:58.688298 kernel: thunder_bgx, ver 1.0 Jan 27 04:45:58.688306 kernel: nicpf, ver 1.0 Jan 27 04:45:58.688317 kernel: nicvf, ver 1.0 Jan 27 04:45:58.688419 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jan 27 04:45:58.688498 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-01-27T04:45:57 UTC (1769489157) Jan 27 04:45:58.688508 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 27 04:45:58.688517 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jan 27 04:45:58.688525 kernel: watchdog: NMI not fully supported Jan 27 04:45:58.688535 kernel: watchdog: Hard watchdog permanently disabled Jan 27 04:45:58.688543 kernel: NET: Registered PF_INET6 protocol family Jan 27 04:45:58.688551 kernel: Segment Routing with IPv6 Jan 27 04:45:58.688559 kernel: In-situ OAM (IOAM) with 
IPv6 Jan 27 04:45:58.688567 kernel: NET: Registered PF_PACKET protocol family Jan 27 04:45:58.688575 kernel: Key type dns_resolver registered Jan 27 04:45:58.688583 kernel: registered taskstats version 1 Jan 27 04:45:58.688593 kernel: Loading compiled-in X.509 certificates Jan 27 04:45:58.688602 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.66-flatcar: 0bb2f5fda59c0df79f00060b17ba27a9107482bd' Jan 27 04:45:58.688610 kernel: Demotion targets for Node 0: null Jan 27 04:45:58.688618 kernel: Key type .fscrypt registered Jan 27 04:45:58.688626 kernel: Key type fscrypt-provisioning registered Jan 27 04:45:58.688634 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 27 04:45:58.688642 kernel: ima: Allocated hash algorithm: sha1 Jan 27 04:45:58.688650 kernel: ima: No architecture policies found Jan 27 04:45:58.688660 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jan 27 04:45:58.688668 kernel: clk: Disabling unused clocks Jan 27 04:45:58.688676 kernel: PM: genpd: Disabling unused power domains Jan 27 04:45:58.688685 kernel: Freeing unused kernel memory: 12480K Jan 27 04:45:58.688693 kernel: Run /init as init process Jan 27 04:45:58.688701 kernel: with arguments: Jan 27 04:45:58.688709 kernel: /init Jan 27 04:45:58.688718 kernel: with environment: Jan 27 04:45:58.688725 kernel: HOME=/ Jan 27 04:45:58.688733 kernel: TERM=linux Jan 27 04:45:58.688741 kernel: ACPI: bus type USB registered Jan 27 04:45:58.688749 kernel: usbcore: registered new interface driver usbfs Jan 27 04:45:58.688757 kernel: usbcore: registered new interface driver hub Jan 27 04:45:58.688765 kernel: usbcore: registered new device driver usb Jan 27 04:45:58.688849 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 27 04:45:58.688932 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 27 04:45:58.689016 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 27 04:45:58.689112 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 27 04:45:58.689199 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 27 04:45:58.689282 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 27 04:45:58.689398 kernel: hub 1-0:1.0: USB hub found Jan 27 04:45:58.689516 kernel: hub 1-0:1.0: 4 ports detected Jan 27 04:45:58.689620 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jan 27 04:45:58.689719 kernel: hub 2-0:1.0: USB hub found Jan 27 04:45:58.689807 kernel: hub 2-0:1.0: 4 ports detected Jan 27 04:45:58.689901 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Jan 27 04:45:58.689983 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Jan 27 04:45:58.689995 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 27 04:45:58.690010 kernel: GPT:25804799 != 104857599 Jan 27 04:45:58.690019 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 27 04:45:58.690027 kernel: GPT:25804799 != 104857599 Jan 27 04:45:58.690037 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 27 04:45:58.690046 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 27 04:45:58.690054 kernel: SCSI subsystem initialized Jan 27 04:45:58.690063 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 27 04:45:58.690071 kernel: device-mapper: uevent: version 1.0.3 Jan 27 04:45:58.690080 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 27 04:45:58.690098 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jan 27 04:45:58.690110 kernel: raid6: neonx8 gen() 15764 MB/s Jan 27 04:45:58.690118 kernel: raid6: neonx4 gen() 15717 MB/s Jan 27 04:45:58.690126 kernel: raid6: neonx2 gen() 13290 MB/s Jan 27 04:45:58.690134 kernel: raid6: neonx1 gen() 10444 MB/s Jan 27 04:45:58.690143 kernel: raid6: int64x8 gen() 6832 MB/s Jan 27 04:45:58.690151 kernel: raid6: int64x4 gen() 7360 MB/s Jan 27 04:45:58.690159 kernel: raid6: int64x2 gen() 6120 MB/s Jan 27 04:45:58.690168 kernel: raid6: int64x1 gen() 5059 MB/s Jan 27 04:45:58.690178 kernel: raid6: using algorithm neonx8 gen() 15764 MB/s Jan 27 04:45:58.690314 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 27 04:45:58.690329 kernel: raid6: .... xor() 11839 MB/s, rmw enabled Jan 27 04:45:58.690338 kernel: raid6: using neon recovery algorithm Jan 27 04:45:58.690350 kernel: xor: measuring software checksum speed Jan 27 04:45:58.690360 kernel: 8regs : 21647 MB/sec Jan 27 04:45:58.690369 kernel: 32regs : 20197 MB/sec Jan 27 04:45:58.690378 kernel: arm64_neon : 28138 MB/sec Jan 27 04:45:58.690387 kernel: xor: using function: arm64_neon (28138 MB/sec) Jan 27 04:45:58.690395 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 27 04:45:58.690403 kernel: BTRFS: device fsid 7ff6c3b1-3a4d-48e1-b3ff-f7f4bd3516d8 devid 1 transid 35 /dev/mapper/usr (253:0) scanned by mount (275) Jan 27 04:45:58.690412 kernel: BTRFS info (device dm-0): first mount of filesystem 7ff6c3b1-3a4d-48e1-b3ff-f7f4bd3516d8 Jan 27 04:45:58.690422 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jan 27 04:45:58.690436 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 27 04:45:58.690445 kernel: BTRFS info (device dm-0): enabling free space tree Jan 27 04:45:58.690469 kernel: loop: module loaded Jan 27 04:45:58.690478 kernel: loop0: detected capacity change from 0 to 91832 Jan 27 04:45:58.690486 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 27 04:45:58.690496 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jan 27 04:45:58.690644 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 27 04:45:58.690662 kernel: usbcore: registered new interface driver usbhid Jan 27 04:45:58.690670 kernel: usbhid: USB HID core driver Jan 27 04:45:58.690778 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jan 27 04:45:58.690791 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jan 27 04:45:58.690896 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jan 27 04:45:58.690916 systemd[1]: Successfully made /usr/ read-only. 
Jan 27 04:45:58.690928 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 27 04:45:58.690937 systemd[1]: Detected virtualization kvm. Jan 27 04:45:58.690946 systemd[1]: Detected architecture arm64. Jan 27 04:45:58.690955 systemd[1]: Running in initrd. Jan 27 04:45:58.690968 systemd[1]: No hostname configured, using default hostname. Jan 27 04:45:58.690977 systemd[1]: Hostname set to . Jan 27 04:45:58.690986 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 27 04:45:58.690994 systemd[1]: Queued start job for default target initrd.target. Jan 27 04:45:58.691006 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 27 04:45:58.691015 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 27 04:45:58.691024 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 27 04:45:58.691036 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 27 04:45:58.691049 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 27 04:45:58.691058 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 27 04:45:58.691067 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 27 04:45:58.691079 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 27 04:45:58.691104 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 27 04:45:58.691115 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 27 04:45:58.691126 systemd[1]: Reached target paths.target - Path Units. Jan 27 04:45:58.691135 systemd[1]: Reached target slices.target - Slice Units. Jan 27 04:45:58.691147 systemd[1]: Reached target swap.target - Swaps. Jan 27 04:45:58.691156 systemd[1]: Reached target timers.target - Timer Units. Jan 27 04:45:58.691165 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 27 04:45:58.691176 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 27 04:45:58.691188 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 27 04:45:58.691197 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 27 04:45:58.691206 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 27 04:45:58.691214 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 27 04:45:58.691226 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 27 04:45:58.691235 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 27 04:45:58.691245 systemd[1]: Reached target sockets.target - Socket Units. Jan 27 04:45:58.691256 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 27 04:45:58.691267 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... 
Jan 27 04:45:58.691275 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 27 04:45:58.691286 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 27 04:45:58.691297 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 27 04:45:58.691309 systemd[1]: Starting systemd-fsck-usr.service... Jan 27 04:45:58.691319 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 27 04:45:58.691328 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 27 04:45:58.691339 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 27 04:45:58.691351 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 27 04:45:58.691360 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 27 04:45:58.691371 systemd[1]: Finished systemd-fsck-usr.service. Jan 27 04:45:58.691380 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 27 04:45:58.691389 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 27 04:45:58.691424 systemd-journald[418]: Collecting audit messages is enabled. Jan 27 04:45:58.691446 kernel: Bridge firewalling registered Jan 27 04:45:58.691455 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 27 04:45:58.691465 kernel: audit: type=1130 audit(1769489158.621:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.691474 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 27 04:45:58.691484 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 27 04:45:58.691493 kernel: audit: type=1130 audit(1769489158.626:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.691502 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 27 04:45:58.691511 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 27 04:45:58.691520 kernel: audit: type=1130 audit(1769489158.639:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.691529 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 27 04:45:58.691540 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 27 04:45:58.691548 kernel: audit: type=1130 audit(1769489158.651:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.691557 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 27 04:45:58.691566 kernel: audit: type=1334 audit(1769489158.653:6): prog-id=6 op=LOAD Jan 27 04:45:58.691574 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Jan 27 04:45:58.691584 kernel: audit: type=1130 audit(1769489158.659:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.691594 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 27 04:45:58.691603 kernel: audit: type=1130 audit(1769489158.674:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.691612 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 27 04:45:58.691621 systemd-journald[418]: Journal started Jan 27 04:45:58.691641 systemd-journald[418]: Runtime Journal (/run/log/journal/c8af9703fdbc4ad5971b70043d56a90e) is 8M, max 319.5M, 311.5M free. Jan 27 04:45:58.621000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.626000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.639000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.651000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.653000 audit: BPF prog-id=6 op=LOAD Jan 27 04:45:58.659000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.674000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.618318 systemd-modules-load[421]: Inserted module 'br_netfilter' Jan 27 04:45:58.693329 systemd[1]: Started systemd-journald.service - Journal Service. Jan 27 04:45:58.693000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.696034 dracut-cmdline[449]: dracut-109 Jan 27 04:45:58.697369 kernel: audit: type=1130 audit(1769489158.693:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.697604 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Jan 27 04:45:58.702319 dracut-cmdline[449]: Using kernel command line parameters: SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=e8d94e976e545d7a75a81392e4f736b09fc9f1bd0b9dfe995e69ba02d19f509a Jan 27 04:45:58.715158 systemd-resolved[446]: Positive Trust Anchors: Jan 27 04:45:58.715184 systemd-resolved[446]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 27 04:45:58.715187 systemd-resolved[446]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 27 04:45:58.715218 systemd-resolved[446]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 27 04:45:58.717216 systemd-tmpfiles[467]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 27 04:45:58.726000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.723593 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 27 04:45:58.732122 kernel: audit: type=1130 audit(1769489158.726:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.746614 systemd-resolved[446]: Defaulting to hostname 'linux'. Jan 27 04:45:58.747541 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 27 04:45:58.748482 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 27 04:45:58.748000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.797132 kernel: Loading iSCSI transport class v2.0-870. Jan 27 04:45:58.814118 kernel: iscsi: registered transport (tcp) Jan 27 04:45:58.829157 kernel: iscsi: registered transport (qla4xxx) Jan 27 04:45:58.829193 kernel: QLogic iSCSI HBA Driver Jan 27 04:45:58.853758 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 27 04:45:58.878020 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 27 04:45:58.879000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.880717 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 27 04:45:58.922809 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. 
Jan 27 04:45:58.923000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.925113 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 27 04:45:58.926515 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 27 04:45:58.958029 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 27 04:45:58.958000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:58.960000 audit: BPF prog-id=7 op=LOAD Jan 27 04:45:58.960000 audit: BPF prog-id=8 op=LOAD Jan 27 04:45:58.961638 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 27 04:45:58.993622 systemd-udevd[686]: Using default interface naming scheme 'v257'. Jan 27 04:45:59.001338 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 27 04:45:59.002000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:59.003780 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 27 04:45:59.029328 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 27 04:45:59.030000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:59.031000 audit: BPF prog-id=9 op=LOAD Jan 27 04:45:59.034114 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 27 04:45:59.036031 dracut-pre-trigger[761]: rd.md=0: removing MD RAID activation Jan 27 04:45:59.060692 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 27 04:45:59.061000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:59.062887 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 27 04:45:59.079689 systemd-networkd[802]: lo: Link UP Jan 27 04:45:59.079698 systemd-networkd[802]: lo: Gained carrier Jan 27 04:45:59.080145 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 27 04:45:59.081000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:59.081239 systemd[1]: Reached target network.target - Network. Jan 27 04:45:59.149358 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 27 04:45:59.150000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:59.155356 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 27 04:45:59.244349 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. 
Jan 27 04:45:59.262970 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 27 04:45:59.273280 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 27 04:45:59.274888 systemd-networkd[802]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 27 04:45:59.274891 systemd-networkd[802]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 27 04:45:59.275428 systemd-networkd[802]: eth0: Link UP Jan 27 04:45:59.279195 systemd-networkd[802]: eth0: Gained carrier Jan 27 04:45:59.279208 systemd-networkd[802]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 27 04:45:59.281695 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 27 04:45:59.287349 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 27 04:45:59.288480 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 27 04:45:59.289000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:59.288599 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 27 04:45:59.290016 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 27 04:45:59.306825 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 27 04:45:59.317745 disk-uuid[876]: Primary Header is updated. Jan 27 04:45:59.317745 disk-uuid[876]: Secondary Entries is updated. Jan 27 04:45:59.317745 disk-uuid[876]: Secondary Header is updated. Jan 27 04:45:59.324170 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 27 04:45:59.324000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:59.325563 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 27 04:45:59.329023 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 27 04:45:59.334411 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 27 04:45:59.340245 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 27 04:45:59.350270 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 27 04:45:59.351000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:45:59.368213 systemd-networkd[802]: eth0: DHCPv4 address 10.0.3.32/25, gateway 10.0.3.1 acquired from 10.0.3.1 Jan 27 04:45:59.369479 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 27 04:45:59.370000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:00.375080 disk-uuid[881]: Warning: The kernel is still using the old partition table. 
Jan 27 04:46:00.375080 disk-uuid[881]: The new table will be used at the next reboot or after you Jan 27 04:46:00.375080 disk-uuid[881]: run partprobe(8) or kpartx(8) Jan 27 04:46:00.375080 disk-uuid[881]: The operation has completed successfully. Jan 27 04:46:00.380503 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 27 04:46:00.381000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:00.381000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:00.380615 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 27 04:46:00.382522 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 27 04:46:00.439130 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (905) Jan 27 04:46:00.442112 kernel: BTRFS info (device vda6): first mount of filesystem 4aa71fd1-d094-4275-9bfb-293ba022a649 Jan 27 04:46:00.442162 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 27 04:46:00.458130 kernel: BTRFS info (device vda6): turning on async discard Jan 27 04:46:00.458172 kernel: BTRFS info (device vda6): enabling free space tree Jan 27 04:46:00.463110 kernel: BTRFS info (device vda6): last unmount of filesystem 4aa71fd1-d094-4275-9bfb-293ba022a649 Jan 27 04:46:00.463969 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 27 04:46:00.464000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:00.466134 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 27 04:46:00.595237 systemd-networkd[802]: eth0: Gained IPv6LL Jan 27 04:46:00.681376 ignition[924]: Ignition 2.24.0 Jan 27 04:46:00.681455 ignition[924]: Stage: fetch-offline Jan 27 04:46:00.682415 ignition[924]: no configs at "/usr/lib/ignition/base.d" Jan 27 04:46:00.683021 ignition[924]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 27 04:46:00.683213 ignition[924]: parsed url from cmdline: "" Jan 27 04:46:00.683216 ignition[924]: no config URL provided Jan 27 04:46:00.683665 ignition[924]: reading system config file "/usr/lib/ignition/user.ign" Jan 27 04:46:00.686000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:00.685685 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 27 04:46:00.683674 ignition[924]: no config at "/usr/lib/ignition/user.ign" Jan 27 04:46:00.687939 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Jan 27 04:46:00.683679 ignition[924]: failed to fetch config: resource requires networking Jan 27 04:46:00.683841 ignition[924]: Ignition finished successfully Jan 27 04:46:00.713079 ignition[935]: Ignition 2.24.0 Jan 27 04:46:00.713113 ignition[935]: Stage: fetch Jan 27 04:46:00.713250 ignition[935]: no configs at "/usr/lib/ignition/base.d" Jan 27 04:46:00.713258 ignition[935]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 27 04:46:00.713332 ignition[935]: parsed url from cmdline: "" Jan 27 04:46:00.713335 ignition[935]: no config URL provided Jan 27 04:46:00.713339 ignition[935]: reading system config file "/usr/lib/ignition/user.ign" Jan 27 04:46:00.713344 ignition[935]: no config at "/usr/lib/ignition/user.ign" Jan 27 04:46:00.713499 ignition[935]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 27 04:46:00.713878 ignition[935]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 27 04:46:00.713922 ignition[935]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 27 04:46:01.714138 ignition[935]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 27 04:46:01.714164 ignition[935]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 27 04:46:02.714624 ignition[935]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 27 04:46:02.714673 ignition[935]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 27 04:46:03.715354 ignition[935]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 27 04:46:03.715415 ignition[935]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 27 04:46:04.715595 ignition[935]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 27 04:46:04.715695 ignition[935]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 27 04:46:05.715851 ignition[935]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 27 04:46:05.715917 ignition[935]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 27 04:46:05.899305 ignition[935]: GET result: OK Jan 27 04:46:05.899526 ignition[935]: parsing config with SHA512: 270b10ce96a93cdf02c7eb432f642b8e976f02c5188b5d3e5188d695734c3351fe12f99c3ae86d2581a1e56886b32ed1eaedc1aa0e85d4153ad345f672675fcb Jan 27 04:46:05.904417 unknown[935]: fetched base config from "system" Jan 27 04:46:05.904429 unknown[935]: fetched base config from "system" Jan 27 04:46:05.904777 ignition[935]: fetch: fetch complete Jan 27 04:46:05.904434 unknown[935]: fetched user config from "openstack" Jan 27 04:46:05.904782 ignition[935]: fetch: fetch passed Jan 27 04:46:05.911761 kernel: kauditd_printk_skb: 20 callbacks suppressed Jan 27 04:46:05.911785 kernel: audit: type=1130 audit(1769489165.908:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:05.908000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:05.907147 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 27 04:46:05.904821 ignition[935]: Ignition finished successfully Jan 27 04:46:05.910526 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Jan 27 04:46:05.933655 ignition[943]: Ignition 2.24.0 Jan 27 04:46:05.933675 ignition[943]: Stage: kargs Jan 27 04:46:05.933820 ignition[943]: no configs at "/usr/lib/ignition/base.d" Jan 27 04:46:05.933828 ignition[943]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 27 04:46:05.934580 ignition[943]: kargs: kargs passed Jan 27 04:46:05.934623 ignition[943]: Ignition finished successfully Jan 27 04:46:05.938247 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 27 04:46:05.939000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:05.940407 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 27 04:46:05.943449 kernel: audit: type=1130 audit(1769489165.939:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:05.969071 ignition[950]: Ignition 2.24.0 Jan 27 04:46:05.969106 ignition[950]: Stage: disks Jan 27 04:46:05.969260 ignition[950]: no configs at "/usr/lib/ignition/base.d" Jan 27 04:46:05.969269 ignition[950]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 27 04:46:05.971977 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 27 04:46:05.974000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:05.970012 ignition[950]: disks: disks passed Jan 27 04:46:05.979314 kernel: audit: type=1130 audit(1769489165.974:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:05.974647 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 27 04:46:05.970054 ignition[950]: Ignition finished successfully Jan 27 04:46:05.978797 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 27 04:46:05.980288 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 27 04:46:05.981983 systemd[1]: Reached target sysinit.target - System Initialization. Jan 27 04:46:05.983522 systemd[1]: Reached target basic.target - Basic System. Jan 27 04:46:05.986087 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 27 04:46:06.041674 systemd-fsck[960]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 27 04:46:06.046162 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 27 04:46:06.047000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:06.048378 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 27 04:46:06.052295 kernel: audit: type=1130 audit(1769489166.047:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:06.242129 kernel: EXT4-fs (vda9): mounted filesystem 2db79406-f76c-40e5-9b6b-2c9503021d45 r/w with ordered data mode. Quota mode: none. Jan 27 04:46:06.242634 systemd[1]: Mounted sysroot.mount - /sysroot. 
Jan 27 04:46:06.243725 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 27 04:46:06.247926 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 27 04:46:06.250750 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 27 04:46:06.251608 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 27 04:46:06.259026 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 27 04:46:06.260075 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 27 04:46:06.260119 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 27 04:46:06.261869 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 27 04:46:06.264182 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 27 04:46:06.289119 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (969) Jan 27 04:46:06.295291 kernel: BTRFS info (device vda6): first mount of filesystem 4aa71fd1-d094-4275-9bfb-293ba022a649 Jan 27 04:46:06.295354 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 27 04:46:06.304130 kernel: BTRFS info (device vda6): turning on async discard Jan 27 04:46:06.304182 kernel: BTRFS info (device vda6): enabling free space tree Jan 27 04:46:06.305218 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 27 04:46:06.357108 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 04:46:06.553517 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 27 04:46:06.557189 kernel: audit: type=1130 audit(1769489166.554:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:06.554000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:06.555308 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 27 04:46:06.558553 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 27 04:46:06.573066 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 27 04:46:06.574673 kernel: BTRFS info (device vda6): last unmount of filesystem 4aa71fd1-d094-4275-9bfb-293ba022a649 Jan 27 04:46:06.592152 ignition[1070]: INFO : Ignition 2.24.0 Jan 27 04:46:06.592152 ignition[1070]: INFO : Stage: mount Jan 27 04:46:06.594657 ignition[1070]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 27 04:46:06.594657 ignition[1070]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 27 04:46:06.594657 ignition[1070]: INFO : mount: mount passed Jan 27 04:46:06.594657 ignition[1070]: INFO : Ignition finished successfully Jan 27 04:46:06.597000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:06.596944 systemd[1]: Finished ignition-mount.service - Ignition (mount). 
Jan 27 04:46:06.601849 kernel: audit: type=1130 audit(1769489166.597:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:06.605810 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 27 04:46:06.606000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:06.610116 kernel: audit: type=1130 audit(1769489166.606:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:07.443129 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 04:46:09.452115 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 04:46:13.457119 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 04:46:13.464895 coreos-metadata[971]: Jan 27 04:46:13.464 WARN failed to locate config-drive, using the metadata service API instead Jan 27 04:46:13.483744 coreos-metadata[971]: Jan 27 04:46:13.483 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 27 04:46:14.825132 coreos-metadata[971]: Jan 27 04:46:14.825 INFO Fetch successful Jan 27 04:46:14.826123 coreos-metadata[971]: Jan 27 04:46:14.825 INFO wrote hostname ci-4592-0-0-n-c2731c5fad to /sysroot/etc/hostname Jan 27 04:46:14.827909 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 27 04:46:14.828927 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 27 04:46:14.831000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:14.832334 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 27 04:46:14.836820 kernel: audit: type=1130 audit(1769489174.831:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:14.836845 kernel: audit: type=1131 audit(1769489174.831:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:14.831000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:14.863591 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 27 04:46:14.893127 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1087) Jan 27 04:46:14.897156 kernel: BTRFS info (device vda6): first mount of filesystem 4aa71fd1-d094-4275-9bfb-293ba022a649 Jan 27 04:46:14.897241 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 27 04:46:14.903481 kernel: BTRFS info (device vda6): turning on async discard Jan 27 04:46:14.903561 kernel: BTRFS info (device vda6): enabling free space tree Jan 27 04:46:14.905238 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jan 27 04:46:14.930575 ignition[1105]: INFO : Ignition 2.24.0 Jan 27 04:46:14.930575 ignition[1105]: INFO : Stage: files Jan 27 04:46:14.932044 ignition[1105]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 27 04:46:14.932044 ignition[1105]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 27 04:46:14.932044 ignition[1105]: DEBUG : files: compiled without relabeling support, skipping Jan 27 04:46:14.935667 ignition[1105]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 27 04:46:14.935667 ignition[1105]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 27 04:46:14.947175 ignition[1105]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 27 04:46:14.948413 ignition[1105]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 27 04:46:14.948413 ignition[1105]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 27 04:46:14.947842 unknown[1105]: wrote ssh authorized keys file for user: core Jan 27 04:46:14.958821 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Jan 27 04:46:14.961164 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1 Jan 27 04:46:15.020136 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 27 04:46:15.133916 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Jan 27 04:46:15.133916 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 27 04:46:15.137228 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 27 04:46:15.137228 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 27 04:46:15.137228 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 27 04:46:15.137228 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 27 04:46:15.137228 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 27 04:46:15.137228 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 27 04:46:15.137228 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 27 04:46:15.137228 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 27 04:46:15.137228 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 27 04:46:15.137228 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jan 27 04:46:15.151222 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: 
op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jan 27 04:46:15.151222 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jan 27 04:46:15.151222 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1 Jan 27 04:46:15.589776 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 27 04:46:16.139550 ignition[1105]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jan 27 04:46:16.139550 ignition[1105]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 27 04:46:16.144847 ignition[1105]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 27 04:46:16.148097 ignition[1105]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 27 04:46:16.150825 ignition[1105]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 27 04:46:16.150825 ignition[1105]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 27 04:46:16.150825 ignition[1105]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 27 04:46:16.150825 ignition[1105]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 27 04:46:16.150825 ignition[1105]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 27 04:46:16.150825 ignition[1105]: INFO : files: files passed Jan 27 04:46:16.150825 ignition[1105]: INFO : Ignition finished successfully Jan 27 04:46:16.162374 kernel: audit: type=1130 audit(1769489176.153:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.153000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.152049 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 27 04:46:16.154763 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 27 04:46:16.158445 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 27 04:46:16.169194 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 27 04:46:16.171140 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 27 04:46:16.171000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.171000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:46:16.176662 kernel: audit: type=1130 audit(1769489176.171:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.176731 kernel: audit: type=1131 audit(1769489176.171:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.178662 initrd-setup-root-after-ignition[1139]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 27 04:46:16.178662 initrd-setup-root-after-ignition[1139]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 27 04:46:16.181291 initrd-setup-root-after-ignition[1143]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 27 04:46:16.182000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.180863 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 27 04:46:16.187809 kernel: audit: type=1130 audit(1769489176.182:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.182646 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 27 04:46:16.187763 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 27 04:46:16.229086 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 27 04:46:16.229232 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 27 04:46:16.236071 kernel: audit: type=1130 audit(1769489176.230:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.236167 kernel: audit: type=1131 audit(1769489176.230:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.230000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.230000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.231061 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 27 04:46:16.236891 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 27 04:46:16.238629 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 27 04:46:16.239575 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 27 04:46:16.279258 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. 
Jan 27 04:46:16.280000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.281581 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 27 04:46:16.284960 kernel: audit: type=1130 audit(1769489176.280:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.301924 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 27 04:46:16.302141 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 27 04:46:16.304289 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 27 04:46:16.305917 systemd[1]: Stopped target timers.target - Timer Units. Jan 27 04:46:16.307390 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 27 04:46:16.308000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.307526 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 27 04:46:16.312638 kernel: audit: type=1131 audit(1769489176.308:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.311699 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 27 04:46:16.313455 systemd[1]: Stopped target basic.target - Basic System. Jan 27 04:46:16.314791 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 27 04:46:16.316118 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 27 04:46:16.317729 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 27 04:46:16.319278 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 27 04:46:16.320834 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 27 04:46:16.322300 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 27 04:46:16.324064 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 27 04:46:16.325709 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 27 04:46:16.327048 systemd[1]: Stopped target swap.target - Swaps. Jan 27 04:46:16.328301 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 27 04:46:16.329000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.328435 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 27 04:46:16.330284 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 27 04:46:16.331845 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 27 04:46:16.333366 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 27 04:46:16.337192 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 27 04:46:16.338568 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Jan 27 04:46:16.340000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.338697 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 27 04:46:16.341181 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 27 04:46:16.342000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.341303 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 27 04:46:16.344000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.342920 systemd[1]: ignition-files.service: Deactivated successfully. Jan 27 04:46:16.343020 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 27 04:46:16.345286 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 27 04:46:16.346617 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 27 04:46:16.348000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.346747 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 27 04:46:16.348965 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 27 04:46:16.352000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.350274 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 27 04:46:16.353000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.350400 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 27 04:46:16.355000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.352153 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 27 04:46:16.352260 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 27 04:46:16.353684 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 27 04:46:16.353783 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 27 04:46:16.361373 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 27 04:46:16.362109 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 27 04:46:16.363000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:46:16.364000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.369160 ignition[1163]: INFO : Ignition 2.24.0 Jan 27 04:46:16.369160 ignition[1163]: INFO : Stage: umount Jan 27 04:46:16.370762 ignition[1163]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 27 04:46:16.370762 ignition[1163]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 27 04:46:16.370762 ignition[1163]: INFO : umount: umount passed Jan 27 04:46:16.370762 ignition[1163]: INFO : Ignition finished successfully Jan 27 04:46:16.373000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.371396 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 27 04:46:16.372191 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 27 04:46:16.376000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.374445 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 27 04:46:16.378000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.375144 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 27 04:46:16.379000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.375232 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 27 04:46:16.376961 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 27 04:46:16.382000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.377013 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 27 04:46:16.378391 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 27 04:46:16.378451 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 27 04:46:16.379722 systemd[1]: Stopped target network.target - Network. Jan 27 04:46:16.380889 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 27 04:46:16.380939 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 27 04:46:16.382445 systemd[1]: Stopped target paths.target - Path Units. Jan 27 04:46:16.383657 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 27 04:46:16.393000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.383723 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 27 04:46:16.394000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:46:16.385134 systemd[1]: Stopped target slices.target - Slice Units. Jan 27 04:46:16.386414 systemd[1]: Stopped target sockets.target - Socket Units. Jan 27 04:46:16.387852 systemd[1]: iscsid.socket: Deactivated successfully. Jan 27 04:46:16.399000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.387893 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 27 04:46:16.389110 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 27 04:46:16.401000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.389144 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 27 04:46:16.390823 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 27 04:46:16.390845 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 27 04:46:16.392378 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 27 04:46:16.392432 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 27 04:46:16.393692 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 27 04:46:16.393734 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 27 04:46:16.395306 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 27 04:46:16.397410 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 27 04:46:16.398925 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 27 04:46:16.399028 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 27 04:46:16.400473 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 27 04:46:16.400581 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 27 04:46:16.413313 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 27 04:46:16.413460 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 27 04:46:16.414000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.417134 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 27 04:46:16.417235 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 27 04:46:16.418000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.422071 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 27 04:46:16.422000 audit: BPF prog-id=6 op=UNLOAD Jan 27 04:46:16.423000 audit: BPF prog-id=9 op=UNLOAD Jan 27 04:46:16.424163 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 27 04:46:16.424229 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 27 04:46:16.425986 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 27 04:46:16.427402 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. 
Jan 27 04:46:16.428000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.427465 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 27 04:46:16.430000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.429081 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 27 04:46:16.432000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.429137 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 27 04:46:16.430660 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 27 04:46:16.430702 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 27 04:46:16.432430 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 27 04:46:16.445675 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 27 04:46:16.447000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.446587 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 27 04:46:16.447820 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 27 04:46:16.447857 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 27 04:46:16.451000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.448823 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 27 04:46:16.448854 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 27 04:46:16.454000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.450622 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 27 04:46:16.456000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.450674 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 27 04:46:16.452847 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 27 04:46:16.452893 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 27 04:46:16.455100 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 27 04:46:16.455149 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 27 04:46:16.461000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:46:16.462000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.458237 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 27 04:46:16.464000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.459735 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 27 04:46:16.459807 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 27 04:46:16.461352 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 27 04:46:16.461399 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 27 04:46:16.462905 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 27 04:46:16.462950 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 27 04:46:16.469000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.465448 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 27 04:46:16.468198 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 27 04:46:16.474236 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 27 04:46:16.474361 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 27 04:46:16.475000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.475000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:16.476157 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 27 04:46:16.478273 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 27 04:46:16.499633 systemd[1]: Switching root. Jan 27 04:46:16.525236 systemd-journald[418]: Journal stopped Jan 27 04:46:17.743309 systemd-journald[418]: Received SIGTERM from PID 1 (systemd). Jan 27 04:46:17.743390 kernel: SELinux: policy capability network_peer_controls=1 Jan 27 04:46:17.743407 kernel: SELinux: policy capability open_perms=1 Jan 27 04:46:17.743420 kernel: SELinux: policy capability extended_socket_class=1 Jan 27 04:46:17.743435 kernel: SELinux: policy capability always_check_network=0 Jan 27 04:46:17.743449 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 27 04:46:17.743461 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 27 04:46:17.743478 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 27 04:46:17.743493 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 27 04:46:17.743503 kernel: SELinux: policy capability userspace_initial_context=0 Jan 27 04:46:17.743513 systemd[1]: Successfully loaded SELinux policy in 103.135ms. Jan 27 04:46:17.743531 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.726ms. 
Jan 27 04:46:17.743544 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 27 04:46:17.743557 systemd[1]: Detected virtualization kvm. Jan 27 04:46:17.743570 systemd[1]: Detected architecture arm64. Jan 27 04:46:17.743581 systemd[1]: Detected first boot. Jan 27 04:46:17.743595 systemd[1]: Hostname set to . Jan 27 04:46:17.743609 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 27 04:46:17.743620 zram_generator::config[1208]: No configuration found. Jan 27 04:46:17.743637 kernel: NET: Registered PF_VSOCK protocol family Jan 27 04:46:17.743648 systemd[1]: Populated /etc with preset unit settings. Jan 27 04:46:17.743659 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 27 04:46:17.743670 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 27 04:46:17.743710 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 27 04:46:17.743724 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 27 04:46:17.743735 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 27 04:46:17.743745 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 27 04:46:17.743758 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 27 04:46:17.743769 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 27 04:46:17.743780 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 27 04:46:17.743791 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 27 04:46:17.743802 systemd[1]: Created slice user.slice - User and Session Slice. Jan 27 04:46:17.743813 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 27 04:46:17.743825 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 27 04:46:17.743953 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 27 04:46:17.743972 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 27 04:46:17.743984 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 27 04:46:17.743995 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 27 04:46:17.744007 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Jan 27 04:46:17.744017 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 27 04:46:17.744030 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 27 04:46:17.744041 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 27 04:46:17.744052 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 27 04:46:17.744063 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 27 04:46:17.744074 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. 
Jan 27 04:46:17.744085 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 27 04:46:17.744119 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 27 04:46:17.744132 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 27 04:46:17.744144 systemd[1]: Reached target slices.target - Slice Units. Jan 27 04:46:17.744155 systemd[1]: Reached target swap.target - Swaps. Jan 27 04:46:17.744166 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 27 04:46:17.744178 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 27 04:46:17.744189 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 27 04:46:17.744202 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 27 04:46:17.744214 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 27 04:46:17.744225 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 27 04:46:17.744238 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 27 04:46:17.744255 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 27 04:46:17.744266 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 27 04:46:17.744280 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 27 04:46:17.744292 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 27 04:46:17.744303 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 27 04:46:17.744313 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 27 04:46:17.744324 systemd[1]: Mounting media.mount - External Media Directory... Jan 27 04:46:17.744335 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 27 04:46:17.744346 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 27 04:46:17.744358 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 27 04:46:17.744371 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 27 04:46:17.744383 systemd[1]: Reached target machines.target - Containers. Jan 27 04:46:17.744394 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 27 04:46:17.744406 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 27 04:46:17.744417 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 27 04:46:17.744429 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 27 04:46:17.744440 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 27 04:46:17.744453 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 27 04:46:17.744464 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 27 04:46:17.744475 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 27 04:46:17.744488 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Jan 27 04:46:17.744499 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 27 04:46:17.744510 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 27 04:46:17.744521 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 27 04:46:17.744532 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 27 04:46:17.744543 systemd[1]: Stopped systemd-fsck-usr.service. Jan 27 04:46:17.744555 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 27 04:46:17.744568 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 27 04:46:17.744579 kernel: fuse: init (API version 7.41) Jan 27 04:46:17.744650 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 27 04:46:17.744667 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 27 04:46:17.744678 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 27 04:46:17.744689 kernel: ACPI: bus type drm_connector registered Jan 27 04:46:17.744700 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 27 04:46:17.744715 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 27 04:46:17.744726 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 27 04:46:17.744737 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 27 04:46:17.744784 systemd-journald[1271]: Collecting audit messages is enabled. Jan 27 04:46:17.744818 systemd[1]: Mounted media.mount - External Media Directory. Jan 27 04:46:17.744831 systemd-journald[1271]: Journal started Jan 27 04:46:17.744855 systemd-journald[1271]: Runtime Journal (/run/log/journal/c8af9703fdbc4ad5971b70043d56a90e) is 8M, max 319.5M, 311.5M free. Jan 27 04:46:17.609000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Jan 27 04:46:17.689000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.691000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:46:17.696000 audit: BPF prog-id=14 op=UNLOAD Jan 27 04:46:17.696000 audit: BPF prog-id=13 op=UNLOAD Jan 27 04:46:17.697000 audit: BPF prog-id=15 op=LOAD Jan 27 04:46:17.697000 audit: BPF prog-id=16 op=LOAD Jan 27 04:46:17.697000 audit: BPF prog-id=17 op=LOAD Jan 27 04:46:17.741000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 27 04:46:17.741000 audit[1271]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=5 a1=ffffff15e140 a2=4000 a3=0 items=0 ppid=1 pid=1271 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:17.741000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 27 04:46:17.515831 systemd[1]: Queued start job for default target multi-user.target. Jan 27 04:46:17.541459 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 27 04:46:17.541905 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 27 04:46:17.747467 systemd[1]: Started systemd-journald.service - Journal Service. Jan 27 04:46:17.747000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.747961 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 27 04:46:17.750329 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 27 04:46:17.751565 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 27 04:46:17.754132 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 27 04:46:17.755000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.756286 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 27 04:46:17.756462 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 27 04:46:17.759000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.759000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.759756 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 27 04:46:17.759946 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 27 04:46:17.760000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.760000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:46:17.762000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.762000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.761290 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 27 04:46:17.761452 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 27 04:46:17.762650 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 27 04:46:17.762808 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 27 04:46:17.763000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.763000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.764331 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 27 04:46:17.764495 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 27 04:46:17.765000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.765000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.765873 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 27 04:46:17.766032 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 27 04:46:17.766000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.767000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.767414 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 27 04:46:17.768000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.770250 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 27 04:46:17.771000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.772393 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. 
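The audit records interleaved above (SYSCALL, PROCTITLE, SERVICE_START/SERVICE_STOP) are flat key=value lists, quoted only around values such as comm= and exe=. A minimal parser in Python, fed an abridged copy of the SYSCALL record emitted for systemd-journald earlier (the sample string is shortened, not the full record):

# Illustrative: parse an audit record's key=value fields into a dict.
import shlex

SAMPLE = ("arch=c00000b7 syscall=211 success=yes exit=60 ppid=1 pid=1271 "
          "auid=4294967295 uid=0 comm=\"systemd-journal\" "
          "exe=\"/usr/lib/systemd/systemd-journald\" key=(null)")

def parse_audit(record: str) -> dict:
    fields = {}
    for token in shlex.split(record):   # shlex strips the quoting around comm/exe
        key, _, value = token.partition("=")
        fields[key] = value
    return fields

rec = parse_audit(SAMPLE)
# arch c00000b7 is AUDIT_ARCH_AARCH64, matching the "Detected architecture arm64"
# entry earlier; auid 4294967295 is the unset (-1) login UID.
print(rec["exe"], "-> syscall", rec["syscall"], "success:", rec["success"])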
Jan 27 04:46:17.773000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.774064 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 27 04:46:17.776000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.784462 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 27 04:46:17.785000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.789153 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 27 04:46:17.790378 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 27 04:46:17.792575 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 27 04:46:17.794494 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 27 04:46:17.795397 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 27 04:46:17.795425 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 27 04:46:17.797020 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 27 04:46:17.798290 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 27 04:46:17.798400 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 27 04:46:17.813279 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 27 04:46:17.816260 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 27 04:46:17.817335 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 27 04:46:17.818403 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 27 04:46:17.819363 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 27 04:46:17.821410 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 27 04:46:17.826383 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 27 04:46:17.828691 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 27 04:46:17.829536 systemd-journald[1271]: Time spent on flushing to /var/log/journal/c8af9703fdbc4ad5971b70043d56a90e is 24.999ms for 1822 entries. Jan 27 04:46:17.829536 systemd-journald[1271]: System Journal (/var/log/journal/c8af9703fdbc4ad5971b70043d56a90e) is 8M, max 588.1M, 580.1M free. Jan 27 04:46:17.863411 systemd-journald[1271]: Received client request to flush runtime journal. 
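journald's own statistics above give both journal sizes (runtime journal 8M of a 319.5M cap, system journal 8M of a 588.1M cap) and the cost of flushing the runtime journal to /var/log/journal: 24.999 ms for 1822 entries. A quick back-of-the-envelope check in Python, with the values copied from those entries:

# Values copied from the journald status lines above.
flush_ms, entries = 24.999, 1822
runtime_cap, runtime_free = 319.5, 311.5      # "is 8M, max 319.5M, 311.5M free"

print(f"{flush_ms * 1000 / entries:.1f} us per flushed entry")        # ~13.7 us
print(f"runtime journal in use: {runtime_cap - runtime_free:.1f}M")   # matches the reported 8M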
Jan 27 04:46:17.863460 kernel: loop1: detected capacity change from 0 to 100192 Jan 27 04:46:17.832000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.860000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.832147 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 27 04:46:17.834407 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 27 04:46:17.835803 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 27 04:46:17.859196 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 27 04:46:17.860485 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 27 04:46:17.862631 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 27 04:46:17.869000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.868165 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 27 04:46:17.879961 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 27 04:46:17.883000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.965806 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 27 04:46:17.966000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.972126 kernel: loop2: detected capacity change from 0 to 207008 Jan 27 04:46:17.992768 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 27 04:46:17.993000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:17.994000 audit: BPF prog-id=18 op=LOAD Jan 27 04:46:17.994000 audit: BPF prog-id=19 op=LOAD Jan 27 04:46:17.994000 audit: BPF prog-id=20 op=LOAD Jan 27 04:46:17.995797 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 27 04:46:17.997000 audit: BPF prog-id=21 op=LOAD Jan 27 04:46:17.998060 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 27 04:46:17.999831 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
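The loop-device messages above ("detected capacity change from 0 to N") report capacities in 512-byte sectors; converting the first two gives the sizes of the backing images, which are presumably the sysext .raw images that sd-merge picks up shortly afterwards. A short Python conversion with the values from those kernel lines:

# Illustrative: convert the kernel's sector counts to MiB.
SECTOR = 512
capacities = {"loop1": 100192, "loop2": 207008}   # from the kernel lines above

for dev, sectors in capacities.items():
    mib = sectors * SECTOR / (1024 * 1024)
    print(f"{dev}: {sectors} sectors = {mib:.1f} MiB")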
Jan 27 04:46:18.005000 audit: BPF prog-id=22 op=LOAD Jan 27 04:46:18.005000 audit: BPF prog-id=23 op=LOAD Jan 27 04:46:18.005000 audit: BPF prog-id=24 op=LOAD Jan 27 04:46:18.008000 audit: BPF prog-id=25 op=LOAD Jan 27 04:46:18.008000 audit: BPF prog-id=26 op=LOAD Jan 27 04:46:18.008000 audit: BPF prog-id=27 op=LOAD Jan 27 04:46:18.006693 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 27 04:46:18.008925 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 27 04:46:18.039674 systemd-nsresourced[1353]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 27 04:46:18.040238 systemd-tmpfiles[1352]: ACLs are not supported, ignoring. Jan 27 04:46:18.040259 systemd-tmpfiles[1352]: ACLs are not supported, ignoring. Jan 27 04:46:18.041704 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 27 04:46:18.042000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:18.050000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:18.049566 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 27 04:46:18.053230 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 27 04:46:18.054000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:18.085126 kernel: loop3: detected capacity change from 0 to 45344 Jan 27 04:46:18.108408 systemd-oomd[1350]: No swap; memory pressure usage will be degraded Jan 27 04:46:18.109449 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 27 04:46:18.110000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:18.115986 systemd-resolved[1351]: Positive Trust Anchors: Jan 27 04:46:18.116009 systemd-resolved[1351]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 27 04:46:18.116013 systemd-resolved[1351]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 27 04:46:18.116045 systemd-resolved[1351]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 27 04:46:18.127057 systemd-resolved[1351]: Using system hostname 'ci-4592-0-0-n-c2731c5fad'. Jan 27 04:46:18.128554 systemd[1]: Started systemd-resolved.service - Network Name Resolution. 
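systemd-resolved's startup dump above lists the root-zone DS records it trusts (key tags 20326 and 38696) and its negative trust anchors, i.e. zones such as home.arpa, the private-range reverse zones and .local beneath which DNSSEC validation is not insisted on. A simplified suffix-match sketch in Python over an abridged copy of that list (resolved's real matching is per-label and more involved):

# Illustrative (simplified) negative-trust-anchor check.
NEGATIVE = {
    "home.arpa", "10.in-addr.arpa", "168.192.in-addr.arpa", "d.f.ip6.arpa",
    "ipv4only.arpa", "resolver.arpa", "corp", "home", "internal",
    "intranet", "lan", "local", "private", "test",
}   # abridged from the list in the log above

def under_negative_anchor(name: str) -> bool:
    labels = name.rstrip(".").lower().split(".")
    return any(".".join(labels[i:]) in NEGATIVE for i in range(len(labels)))

print(under_negative_anchor("printer.lan"))            # True
print(under_negative_anchor("5.3.0.10.in-addr.arpa"))  # True
print(under_negative_anchor("example.org"))            # False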
Jan 27 04:46:18.130000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:18.130576 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 27 04:46:18.177145 kernel: loop4: detected capacity change from 0 to 1648 Jan 27 04:46:18.219134 kernel: loop5: detected capacity change from 0 to 100192 Jan 27 04:46:18.291142 kernel: loop6: detected capacity change from 0 to 207008 Jan 27 04:46:18.362123 kernel: loop7: detected capacity change from 0 to 45344 Jan 27 04:46:18.366877 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 27 04:46:18.367000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:18.368000 audit: BPF prog-id=8 op=UNLOAD Jan 27 04:46:18.368000 audit: BPF prog-id=7 op=UNLOAD Jan 27 04:46:18.368000 audit: BPF prog-id=28 op=LOAD Jan 27 04:46:18.368000 audit: BPF prog-id=29 op=LOAD Jan 27 04:46:18.369562 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 27 04:46:18.406138 kernel: loop1: detected capacity change from 0 to 1648 Jan 27 04:46:18.408250 systemd-udevd[1377]: Using default interface naming scheme 'v257'. Jan 27 04:46:18.422654 (sd-merge)[1375]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'. Jan 27 04:46:18.425787 (sd-merge)[1375]: Merged extensions into '/usr'. Jan 27 04:46:18.430373 systemd[1]: Reload requested from client PID 1329 ('systemd-sysext') (unit systemd-sysext.service)... Jan 27 04:46:18.430391 systemd[1]: Reloading... Jan 27 04:46:18.480131 zram_generator::config[1407]: No configuration found. Jan 27 04:46:18.563184 kernel: mousedev: PS/2 mouse device common for all mice Jan 27 04:46:18.654158 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0 Jan 27 04:46:18.655330 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 27 04:46:18.655396 kernel: [drm] features: -context_init Jan 27 04:46:18.700053 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 27 04:46:18.701408 kernel: [drm] number of scanouts: 1 Jan 27 04:46:18.701470 kernel: [drm] number of cap sets: 0 Jan 27 04:46:18.701982 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Jan 27 04:46:18.702070 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 27 04:46:18.702214 systemd[1]: Reloading finished in 271 ms. Jan 27 04:46:18.704210 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0 Jan 27 04:46:18.708115 kernel: Console: switching to colour frame buffer device 160x50 Jan 27 04:46:18.722151 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 27 04:46:18.726175 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 27 04:46:18.727000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:18.727902 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. 
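The (sd-merge) lines above show systemd-sysext locating four extension images (containerd-flatcar.raw, docker-flatcar.raw, kubernetes.raw, oem-stackit.raw), merging them into /usr, and triggering a daemon reload that finished in 271 ms. A small Python sketch that extracts the image names from that line (the line is copied from the log; the regex is illustrative):

import re

LINE = ("(sd-merge)[1375]: Using extensions 'containerd-flatcar.raw', "
        "'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'.")

extensions = re.findall(r"'([^']+\.raw)'", LINE)
print(extensions)   # ['containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw']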
Jan 27 04:46:18.729000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:18.765738 systemd[1]: Starting ensure-sysext.service... Jan 27 04:46:18.767459 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 27 04:46:18.769000 audit: BPF prog-id=30 op=LOAD Jan 27 04:46:18.769888 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 27 04:46:18.771833 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 27 04:46:18.773730 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 27 04:46:18.775000 audit: BPF prog-id=31 op=LOAD Jan 27 04:46:18.775000 audit: BPF prog-id=25 op=UNLOAD Jan 27 04:46:18.775000 audit: BPF prog-id=32 op=LOAD Jan 27 04:46:18.775000 audit: BPF prog-id=33 op=LOAD Jan 27 04:46:18.775000 audit: BPF prog-id=26 op=UNLOAD Jan 27 04:46:18.775000 audit: BPF prog-id=27 op=UNLOAD Jan 27 04:46:18.775000 audit: BPF prog-id=34 op=LOAD Jan 27 04:46:18.775000 audit: BPF prog-id=21 op=UNLOAD Jan 27 04:46:18.775000 audit: BPF prog-id=35 op=LOAD Jan 27 04:46:18.775000 audit: BPF prog-id=36 op=LOAD Jan 27 04:46:18.775000 audit: BPF prog-id=28 op=UNLOAD Jan 27 04:46:18.775000 audit: BPF prog-id=29 op=UNLOAD Jan 27 04:46:18.777000 audit: BPF prog-id=37 op=LOAD Jan 27 04:46:18.777000 audit: BPF prog-id=15 op=UNLOAD Jan 27 04:46:18.777000 audit: BPF prog-id=38 op=LOAD Jan 27 04:46:18.777000 audit: BPF prog-id=39 op=LOAD Jan 27 04:46:18.777000 audit: BPF prog-id=16 op=UNLOAD Jan 27 04:46:18.777000 audit: BPF prog-id=17 op=UNLOAD Jan 27 04:46:18.778000 audit: BPF prog-id=40 op=LOAD Jan 27 04:46:18.778000 audit: BPF prog-id=18 op=UNLOAD Jan 27 04:46:18.778000 audit: BPF prog-id=41 op=LOAD Jan 27 04:46:18.778000 audit: BPF prog-id=42 op=LOAD Jan 27 04:46:18.778000 audit: BPF prog-id=19 op=UNLOAD Jan 27 04:46:18.778000 audit: BPF prog-id=20 op=UNLOAD Jan 27 04:46:18.778000 audit: BPF prog-id=43 op=LOAD Jan 27 04:46:18.778000 audit: BPF prog-id=22 op=UNLOAD Jan 27 04:46:18.778000 audit: BPF prog-id=44 op=LOAD Jan 27 04:46:18.778000 audit: BPF prog-id=45 op=LOAD Jan 27 04:46:18.778000 audit: BPF prog-id=23 op=UNLOAD Jan 27 04:46:18.778000 audit: BPF prog-id=24 op=UNLOAD Jan 27 04:46:18.784344 systemd[1]: Reload requested from client PID 1493 ('systemctl') (unit ensure-sysext.service)... Jan 27 04:46:18.784360 systemd[1]: Reloading... Jan 27 04:46:18.793245 systemd-tmpfiles[1496]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 27 04:46:18.793372 systemd-tmpfiles[1496]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 27 04:46:18.793653 systemd-tmpfiles[1496]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 27 04:46:18.795353 systemd-tmpfiles[1496]: ACLs are not supported, ignoring. Jan 27 04:46:18.795450 systemd-tmpfiles[1496]: ACLs are not supported, ignoring. Jan 27 04:46:18.810339 systemd-tmpfiles[1496]: Detected autofs mount point /boot during canonicalization of boot. Jan 27 04:46:18.810352 systemd-tmpfiles[1496]: Skipping /boot Jan 27 04:46:18.816559 systemd-tmpfiles[1496]: Detected autofs mount point /boot during canonicalization of boot. 
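The systemd-tmpfiles warnings above ("Duplicate line for path ..., ignoring") mean that two tmpfiles.d fragments declared the same path and the later one was dropped; the first definition parsed wins. A toy model of that de-duplication in Python (the base.conf fragment and its line number are hypothetical; nfs-utils.conf:6 and :7 are taken from the warnings above):

# Toy model: first tmpfiles.d entry for a path wins, later ones are ignored.
entries = [
    ("/usr/lib/tmpfiles.d/base.conf",      3, "/var/lib/nfs/sm"),      # hypothetical earlier fragment
    ("/usr/lib/tmpfiles.d/nfs-utils.conf", 6, "/var/lib/nfs/sm"),
    ("/usr/lib/tmpfiles.d/nfs-utils.conf", 7, "/var/lib/nfs/sm.bak"),
]

seen = {}
for conf, lineno, path in entries:
    if path in seen:
        print(f'{conf}:{lineno}: Duplicate line for path "{path}", ignoring.')
    else:
        seen[path] = (conf, lineno)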
Jan 27 04:46:18.816578 systemd-tmpfiles[1496]: Skipping /boot Jan 27 04:46:18.842186 zram_generator::config[1537]: No configuration found. Jan 27 04:46:18.862507 systemd-networkd[1495]: lo: Link UP Jan 27 04:46:18.862524 systemd-networkd[1495]: lo: Gained carrier Jan 27 04:46:18.864085 systemd-networkd[1495]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 27 04:46:18.864116 systemd-networkd[1495]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 27 04:46:18.865574 systemd-networkd[1495]: eth0: Link UP Jan 27 04:46:18.865727 systemd-networkd[1495]: eth0: Gained carrier Jan 27 04:46:18.865741 systemd-networkd[1495]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 27 04:46:18.881157 systemd-networkd[1495]: eth0: DHCPv4 address 10.0.3.32/25, gateway 10.0.3.1 acquired from 10.0.3.1 Jan 27 04:46:19.008180 systemd[1]: Reloading finished in 223 ms. Jan 27 04:46:19.025305 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 27 04:46:19.026000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.027000 audit: BPF prog-id=46 op=LOAD Jan 27 04:46:19.027000 audit: BPF prog-id=40 op=UNLOAD Jan 27 04:46:19.027000 audit: BPF prog-id=47 op=LOAD Jan 27 04:46:19.027000 audit: BPF prog-id=48 op=LOAD Jan 27 04:46:19.027000 audit: BPF prog-id=41 op=UNLOAD Jan 27 04:46:19.027000 audit: BPF prog-id=42 op=UNLOAD Jan 27 04:46:19.028000 audit: BPF prog-id=49 op=LOAD Jan 27 04:46:19.028000 audit: BPF prog-id=43 op=UNLOAD Jan 27 04:46:19.028000 audit: BPF prog-id=50 op=LOAD Jan 27 04:46:19.028000 audit: BPF prog-id=51 op=LOAD Jan 27 04:46:19.028000 audit: BPF prog-id=44 op=UNLOAD Jan 27 04:46:19.028000 audit: BPF prog-id=45 op=UNLOAD Jan 27 04:46:19.028000 audit: BPF prog-id=52 op=LOAD Jan 27 04:46:19.028000 audit: BPF prog-id=34 op=UNLOAD Jan 27 04:46:19.029000 audit: BPF prog-id=53 op=LOAD Jan 27 04:46:19.029000 audit: BPF prog-id=30 op=UNLOAD Jan 27 04:46:19.029000 audit: BPF prog-id=54 op=LOAD Jan 27 04:46:19.029000 audit: BPF prog-id=55 op=LOAD Jan 27 04:46:19.029000 audit: BPF prog-id=35 op=UNLOAD Jan 27 04:46:19.029000 audit: BPF prog-id=36 op=UNLOAD Jan 27 04:46:19.030000 audit: BPF prog-id=56 op=LOAD Jan 27 04:46:19.030000 audit: BPF prog-id=37 op=UNLOAD Jan 27 04:46:19.030000 audit: BPF prog-id=57 op=LOAD Jan 27 04:46:19.030000 audit: BPF prog-id=58 op=LOAD Jan 27 04:46:19.030000 audit: BPF prog-id=38 op=UNLOAD Jan 27 04:46:19.030000 audit: BPF prog-id=39 op=UNLOAD Jan 27 04:46:19.051000 audit: BPF prog-id=59 op=LOAD Jan 27 04:46:19.051000 audit: BPF prog-id=31 op=UNLOAD Jan 27 04:46:19.051000 audit: BPF prog-id=60 op=LOAD Jan 27 04:46:19.051000 audit: BPF prog-id=61 op=LOAD Jan 27 04:46:19.051000 audit: BPF prog-id=32 op=UNLOAD Jan 27 04:46:19.051000 audit: BPF prog-id=33 op=UNLOAD Jan 27 04:46:19.053283 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 27 04:46:19.054000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:46:19.055266 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 27 04:46:19.056000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.057738 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 27 04:46:19.058000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.066053 systemd[1]: Reached target network.target - Network. Jan 27 04:46:19.068231 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 27 04:46:19.085461 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 27 04:46:19.087856 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 27 04:46:19.089919 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 27 04:46:19.093224 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 27 04:46:19.096827 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 27 04:46:19.099648 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 27 04:46:19.103402 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 27 04:46:19.107930 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 27 04:46:19.113341 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 27 04:46:19.115923 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 27 04:46:19.115000 audit[1589]: SYSTEM_BOOT pid=1589 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.123812 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 27 04:46:19.123995 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 27 04:46:19.124082 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 27 04:46:19.133188 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 27 04:46:19.134000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd-persistent-storage comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.134672 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. 
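A little earlier, systemd-networkd brought eth0 up with the DHCPv4 lease 10.0.3.32/25 and gateway 10.0.3.1. The /25 prefix leaves a 128-address subnet, which the Python standard library can confirm, along with the fact that the gateway is on-link:

import ipaddress

iface = ipaddress.ip_interface("10.0.3.32/25")       # lease from the log above
net = iface.network

print(net)                                            # 10.0.3.0/25
print(net.network_address, "-", net.broadcast_address)  # 10.0.3.0 - 10.0.3.127
print(net.num_addresses, "addresses")                 # 128
print(ipaddress.ip_address("10.0.3.1") in net)        # True: gateway is on-link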
Jan 27 04:46:19.135000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.136222 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 27 04:46:19.136397 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 27 04:46:19.137000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.137000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.137927 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 27 04:46:19.138079 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 27 04:46:19.139000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.139000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.139802 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 27 04:46:19.139992 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 27 04:46:19.141000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.141000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.148707 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 27 04:46:19.150082 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 27 04:46:19.152002 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 27 04:46:19.153805 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 27 04:46:19.161323 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 27 04:46:19.163018 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 27 04:46:19.164118 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 27 04:46:19.164220 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 27 04:46:19.164251 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 27 04:46:19.164295 systemd[1]: Reached target time-set.target - System Time Set. 
Jan 27 04:46:19.165809 systemd[1]: Finished ensure-sysext.service. Jan 27 04:46:19.168000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.168640 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 27 04:46:19.169000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.169952 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 27 04:46:19.170208 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 27 04:46:19.171000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.171000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.171545 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 27 04:46:19.171721 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 27 04:46:19.172000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.172000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.173253 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 27 04:46:19.173405 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 27 04:46:19.174000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.174000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.174905 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 27 04:46:19.175055 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 27 04:46:19.177153 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 27 04:46:19.177213 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 27 04:46:19.177000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:19.177000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:46:19.182273 kernel: PTP clock support registered Jan 27 04:46:19.181814 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 27 04:46:19.181892 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 27 04:46:19.185526 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 27 04:46:19.185770 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 27 04:46:19.185000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 27 04:46:19.185000 audit[1623]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=fffffb593410 a2=420 a3=0 items=0 ppid=1577 pid=1623 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:19.185000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 27 04:46:19.187142 augenrules[1623]: No rules Jan 27 04:46:19.188196 systemd[1]: audit-rules.service: Deactivated successfully. Jan 27 04:46:19.188411 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 27 04:46:19.314131 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 27 04:46:19.315797 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 27 04:46:20.073652 ldconfig[1579]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 27 04:46:20.113031 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 27 04:46:20.115554 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 27 04:46:20.145785 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 27 04:46:20.147062 systemd[1]: Reached target sysinit.target - System Initialization. Jan 27 04:46:20.147990 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 27 04:46:20.149041 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 27 04:46:20.150265 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 27 04:46:20.151191 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 27 04:46:20.152149 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 27 04:46:20.153158 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 27 04:46:20.153974 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 27 04:46:20.155024 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 27 04:46:20.155058 systemd[1]: Reached target paths.target - Path Units. Jan 27 04:46:20.156083 systemd[1]: Reached target timers.target - Timer Units. Jan 27 04:46:20.157310 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. 
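The audit-rules run above ends with augenrules reporting "No rules", and the accompanying PROCTITLE record is hex-encoded because the recorded command line contains NUL separators. Decoding the value copied from that record recovers the exact auditctl invocation:

# Illustrative: decode the hex PROCTITLE value from the audit record above.
PROCTITLE = "2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573"

argv = [part.decode() for part in bytes.fromhex(PROCTITLE).split(b"\x00")]
print(argv)   # ['/sbin/auditctl', '-R', '/etc/audit/audit.rules']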
Jan 27 04:46:20.159983 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 27 04:46:20.162754 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 27 04:46:20.163946 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 27 04:46:20.165048 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 27 04:46:20.171285 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 27 04:46:20.172402 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 27 04:46:20.173889 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 27 04:46:20.174918 systemd[1]: Reached target sockets.target - Socket Units. Jan 27 04:46:20.175744 systemd[1]: Reached target basic.target - Basic System. Jan 27 04:46:20.176522 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 27 04:46:20.176554 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 27 04:46:20.181511 systemd[1]: Starting chronyd.service - NTP client/server... Jan 27 04:46:20.183139 systemd[1]: Starting containerd.service - containerd container runtime... Jan 27 04:46:20.185039 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 27 04:46:20.187215 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 27 04:46:20.189810 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 27 04:46:20.194120 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 04:46:20.194229 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 27 04:46:20.197377 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 27 04:46:20.198181 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 27 04:46:20.206086 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 27 04:46:20.209053 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 27 04:46:20.211438 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 27 04:46:20.215278 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 27 04:46:20.218177 jq[1643]: false Jan 27 04:46:20.218844 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 27 04:46:20.220009 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 27 04:46:20.220483 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 27 04:46:20.222388 systemd[1]: Starting update-engine.service - Update Engine... Jan 27 04:46:20.225430 chronyd[1636]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 27 04:46:20.225773 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 27 04:46:20.228569 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 27 04:46:20.229937 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. 
Jan 27 04:46:20.230188 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 27 04:46:20.230576 jq[1656]: true Jan 27 04:46:20.231158 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 27 04:46:20.231372 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 27 04:46:20.237142 extend-filesystems[1644]: Found /dev/vda6 Jan 27 04:46:20.236923 chronyd[1636]: Loaded seccomp filter (level 2) Jan 27 04:46:20.237519 systemd[1]: Started chronyd.service - NTP client/server. Jan 27 04:46:20.242703 extend-filesystems[1644]: Found /dev/vda9 Jan 27 04:46:20.245269 extend-filesystems[1644]: Checking size of /dev/vda9 Jan 27 04:46:20.245953 systemd[1]: motdgen.service: Deactivated successfully. Jan 27 04:46:20.247565 jq[1665]: true Jan 27 04:46:20.248129 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 27 04:46:20.270856 extend-filesystems[1644]: Resized partition /dev/vda9 Jan 27 04:46:20.271725 tar[1660]: linux-arm64/LICENSE Jan 27 04:46:20.271725 tar[1660]: linux-arm64/helm Jan 27 04:46:20.274445 update_engine[1654]: I20260127 04:46:20.273527 1654 main.cc:92] Flatcar Update Engine starting Jan 27 04:46:20.279626 extend-filesystems[1696]: resize2fs 1.47.3 (8-Jul-2025) Jan 27 04:46:20.290209 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Jan 27 04:46:20.327585 dbus-daemon[1639]: [system] SELinux support is enabled Jan 27 04:46:20.328069 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 27 04:46:20.332051 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 27 04:46:20.332084 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 27 04:46:20.347126 update_engine[1654]: I20260127 04:46:20.342053 1654 update_check_scheduler.cc:74] Next update check in 6m5s Jan 27 04:46:20.333832 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 27 04:46:20.333849 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 27 04:46:20.339998 systemd-logind[1653]: New seat seat0. Jan 27 04:46:20.341761 systemd[1]: Started update-engine.service - Update Engine. Jan 27 04:46:20.345674 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 27 04:46:20.346663 systemd-logind[1653]: Watching system buttons on /dev/input/event0 (Power Button) Jan 27 04:46:20.346680 systemd-logind[1653]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jan 27 04:46:20.348070 systemd[1]: Started systemd-logind.service - User Login Management. Jan 27 04:46:20.371288 systemd-networkd[1495]: eth0: Gained IPv6LL Jan 27 04:46:20.377131 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 27 04:46:20.379341 systemd[1]: Reached target network-online.target - Network is Online. Jan 27 04:46:20.382485 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 04:46:20.385388 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 27 04:46:20.424350 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
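extend-filesystems grows the root filesystem here: resize2fs 1.47.3 takes /dev/vda9 from 1617920 to 11516923 blocks. Assuming the usual 4 KiB ext4 block size (the log does not state it), that is roughly a 6.2 GiB filesystem growing to about 44 GiB:

# Illustrative: block counts copied from the EXT4-fs resize line above;
# 4 KiB block size is an assumption (the ext4 default), not logged.
BLOCK = 4096
before, after = 1617920, 11516923

def to_gib(blocks: int) -> float:
    return blocks * BLOCK / 2**30

print(f"before: {to_gib(before):.1f} GiB, after: {to_gib(after):.1f} GiB, "
      f"growth: {to_gib(after - before):.1f} GiB")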
Jan 27 04:46:20.432411 locksmithd[1708]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 27 04:46:20.547953 containerd[1666]: time="2026-01-27T04:46:20Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 27 04:46:20.552297 containerd[1666]: time="2026-01-27T04:46:20.552240160Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 27 04:46:20.570500 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 27 04:46:20.572643 containerd[1666]: time="2026-01-27T04:46:20.572600680Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.84µs" Jan 27 04:46:20.572694 containerd[1666]: time="2026-01-27T04:46:20.572644320Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 27 04:46:20.572694 containerd[1666]: time="2026-01-27T04:46:20.572688640Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 27 04:46:20.572728 containerd[1666]: time="2026-01-27T04:46:20.572699800Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 27 04:46:20.572857 containerd[1666]: time="2026-01-27T04:46:20.572835400Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 27 04:46:20.572881 containerd[1666]: time="2026-01-27T04:46:20.572857160Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 27 04:46:20.572933 containerd[1666]: time="2026-01-27T04:46:20.572915280Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 27 04:46:20.572933 containerd[1666]: time="2026-01-27T04:46:20.572930560Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 27 04:46:20.573820 containerd[1666]: time="2026-01-27T04:46:20.573786720Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 27 04:46:20.573850 containerd[1666]: time="2026-01-27T04:46:20.573819320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 27 04:46:20.573850 containerd[1666]: time="2026-01-27T04:46:20.573833000Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 27 04:46:20.573850 containerd[1666]: time="2026-01-27T04:46:20.573841640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 27 04:46:20.574016 containerd[1666]: time="2026-01-27T04:46:20.573994360Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 27 04:46:20.574016 containerd[1666]: time="2026-01-27T04:46:20.574015080Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native 
type=io.containerd.snapshotter.v1 Jan 27 04:46:20.574118 containerd[1666]: time="2026-01-27T04:46:20.574087120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 27 04:46:20.574307 containerd[1666]: time="2026-01-27T04:46:20.574286160Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 27 04:46:20.574344 containerd[1666]: time="2026-01-27T04:46:20.574327800Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 27 04:46:20.574344 containerd[1666]: time="2026-01-27T04:46:20.574341960Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 27 04:46:20.574399 containerd[1666]: time="2026-01-27T04:46:20.574374400Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 27 04:46:20.574601 bash[1707]: Updated "/home/core/.ssh/authorized_keys" Jan 27 04:46:20.574788 containerd[1666]: time="2026-01-27T04:46:20.574587560Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 27 04:46:20.574788 containerd[1666]: time="2026-01-27T04:46:20.574658840Z" level=info msg="metadata content store policy set" policy=shared Jan 27 04:46:20.576633 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 27 04:46:20.582231 systemd[1]: Starting sshkeys.service... Jan 27 04:46:20.600024 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 27 04:46:20.602228 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Jan 27 04:46:20.623794 containerd[1666]: time="2026-01-27T04:46:20.623749080Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 27 04:46:20.623868 containerd[1666]: time="2026-01-27T04:46:20.623820200Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 27 04:46:20.623951 containerd[1666]: time="2026-01-27T04:46:20.623928240Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 27 04:46:20.623951 containerd[1666]: time="2026-01-27T04:46:20.623948360Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 27 04:46:20.624017 containerd[1666]: time="2026-01-27T04:46:20.623963280Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 27 04:46:20.624017 containerd[1666]: time="2026-01-27T04:46:20.623982520Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 27 04:46:20.624017 containerd[1666]: time="2026-01-27T04:46:20.623995440Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 27 04:46:20.624017 containerd[1666]: time="2026-01-27T04:46:20.624005600Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 27 04:46:20.624017 containerd[1666]: time="2026-01-27T04:46:20.624016960Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 27 04:46:20.624113 containerd[1666]: time="2026-01-27T04:46:20.624029200Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 27 04:46:20.624113 containerd[1666]: time="2026-01-27T04:46:20.624039920Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 27 04:46:20.624113 containerd[1666]: time="2026-01-27T04:46:20.624050640Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 27 04:46:20.624113 containerd[1666]: time="2026-01-27T04:46:20.624062280Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 27 04:46:20.624113 containerd[1666]: time="2026-01-27T04:46:20.624075040Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 27 04:46:20.625504 containerd[1666]: time="2026-01-27T04:46:20.624238160Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 27 04:46:20.625504 containerd[1666]: time="2026-01-27T04:46:20.624280360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 27 04:46:20.625504 containerd[1666]: time="2026-01-27T04:46:20.624299640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 27 04:46:20.625504 containerd[1666]: time="2026-01-27T04:46:20.624310600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 27 04:46:20.625504 containerd[1666]: time="2026-01-27T04:46:20.624325920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 27 04:46:20.625504 containerd[1666]: 
time="2026-01-27T04:46:20.624335520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 27 04:46:20.625504 containerd[1666]: time="2026-01-27T04:46:20.624347000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 27 04:46:20.625504 containerd[1666]: time="2026-01-27T04:46:20.624357800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 27 04:46:20.625504 containerd[1666]: time="2026-01-27T04:46:20.624368760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 27 04:46:20.625504 containerd[1666]: time="2026-01-27T04:46:20.624379040Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 27 04:46:20.625504 containerd[1666]: time="2026-01-27T04:46:20.624388680Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 27 04:46:20.625504 containerd[1666]: time="2026-01-27T04:46:20.624415720Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 27 04:46:20.625504 containerd[1666]: time="2026-01-27T04:46:20.624451320Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 27 04:46:20.625504 containerd[1666]: time="2026-01-27T04:46:20.624471360Z" level=info msg="Start snapshots syncer" Jan 27 04:46:20.625504 containerd[1666]: time="2026-01-27T04:46:20.624495680Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 27 04:46:20.626236 containerd[1666]: time="2026-01-27T04:46:20.625057640Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 27 
04:46:20.626236 containerd[1666]: time="2026-01-27T04:46:20.625652560Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 27 04:46:20.626444 containerd[1666]: time="2026-01-27T04:46:20.625713960Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 27 04:46:20.627054 containerd[1666]: time="2026-01-27T04:46:20.626945960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 27 04:46:20.627054 containerd[1666]: time="2026-01-27T04:46:20.627007640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 27 04:46:20.627054 containerd[1666]: time="2026-01-27T04:46:20.627021560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 27 04:46:20.627260 containerd[1666]: time="2026-01-27T04:46:20.627042000Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 27 04:46:20.627260 containerd[1666]: time="2026-01-27T04:46:20.627209800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 27 04:46:20.627260 containerd[1666]: time="2026-01-27T04:46:20.627234720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 27 04:46:20.627352 containerd[1666]: time="2026-01-27T04:46:20.627335400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 27 04:46:20.627488 containerd[1666]: time="2026-01-27T04:46:20.627414600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 27 04:46:20.627488 containerd[1666]: time="2026-01-27T04:46:20.627441520Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 27 04:46:20.627578 containerd[1666]: time="2026-01-27T04:46:20.627564040Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 27 04:46:20.627652 containerd[1666]: time="2026-01-27T04:46:20.627633080Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 27 04:46:20.627713 containerd[1666]: time="2026-01-27T04:46:20.627690720Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 27 04:46:20.627768 containerd[1666]: time="2026-01-27T04:46:20.627754880Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 27 04:46:20.628009 containerd[1666]: time="2026-01-27T04:46:20.627827800Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 27 04:46:20.628009 containerd[1666]: time="2026-01-27T04:46:20.627848480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 27 04:46:20.628009 containerd[1666]: time="2026-01-27T04:46:20.627863840Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 27 04:46:20.628167 containerd[1666]: time="2026-01-27T04:46:20.628112400Z" level=info msg="runtime interface created" Jan 27 04:46:20.628167 containerd[1666]: 
time="2026-01-27T04:46:20.628128200Z" level=info msg="created NRI interface" Jan 27 04:46:20.628167 containerd[1666]: time="2026-01-27T04:46:20.628141080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 27 04:46:20.628273 containerd[1666]: time="2026-01-27T04:46:20.628243360Z" level=info msg="Connect containerd service" Jan 27 04:46:20.628481 containerd[1666]: time="2026-01-27T04:46:20.628353600Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 27 04:46:20.630151 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 04:46:20.630976 containerd[1666]: time="2026-01-27T04:46:20.630945880Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 27 04:46:20.757958 containerd[1666]: time="2026-01-27T04:46:20.757722280Z" level=info msg="Start subscribing containerd event" Jan 27 04:46:20.757958 containerd[1666]: time="2026-01-27T04:46:20.757881280Z" level=info msg="Start recovering state" Jan 27 04:46:20.758198 containerd[1666]: time="2026-01-27T04:46:20.758178560Z" level=info msg="Start event monitor" Jan 27 04:46:20.758332 containerd[1666]: time="2026-01-27T04:46:20.758270600Z" level=info msg="Start cni network conf syncer for default" Jan 27 04:46:20.758332 containerd[1666]: time="2026-01-27T04:46:20.758285280Z" level=info msg="Start streaming server" Jan 27 04:46:20.758332 containerd[1666]: time="2026-01-27T04:46:20.758298160Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 27 04:46:20.758332 containerd[1666]: time="2026-01-27T04:46:20.758305240Z" level=info msg="runtime interface starting up..." Jan 27 04:46:20.758332 containerd[1666]: time="2026-01-27T04:46:20.758310520Z" level=info msg="starting plugins..." Jan 27 04:46:20.758674 containerd[1666]: time="2026-01-27T04:46:20.758602200Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 27 04:46:20.758674 containerd[1666]: time="2026-01-27T04:46:20.758632760Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 27 04:46:20.758749 containerd[1666]: time="2026-01-27T04:46:20.758692800Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 27 04:46:20.758956 containerd[1666]: time="2026-01-27T04:46:20.758934160Z" level=info msg="containerd successfully booted in 0.303258s" Jan 27 04:46:20.759122 systemd[1]: Started containerd.service - containerd container runtime. Jan 27 04:46:20.835956 tar[1660]: linux-arm64/README.md Jan 27 04:46:20.858370 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 27 04:46:20.930127 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 27 04:46:20.959775 extend-filesystems[1696]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 27 04:46:20.959775 extend-filesystems[1696]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 27 04:46:20.959775 extend-filesystems[1696]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Jan 27 04:46:20.962802 extend-filesystems[1644]: Resized filesystem in /dev/vda9 Jan 27 04:46:20.962783 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 27 04:46:20.963012 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Jan 27 04:46:21.203129 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 04:46:21.647120 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 04:46:21.706831 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 04:46:21.710169 sshd_keygen[1675]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 27 04:46:21.724646 (kubelet)[1766]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 27 04:46:21.730726 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 27 04:46:21.735816 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 27 04:46:21.738291 systemd[1]: Started sshd@0-10.0.3.32:22-4.153.228.146:44262.service - OpenSSH per-connection server daemon (4.153.228.146:44262). Jan 27 04:46:21.756284 systemd[1]: issuegen.service: Deactivated successfully. Jan 27 04:46:21.756579 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 27 04:46:21.760198 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 27 04:46:21.777747 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 27 04:46:21.781747 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 27 04:46:21.784492 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jan 27 04:46:21.785801 systemd[1]: Reached target getty.target - Login Prompts. Jan 27 04:46:22.319871 sshd[1776]: Accepted publickey for core from 4.153.228.146 port 44262 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:46:22.324274 sshd-session[1776]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:46:22.335408 systemd-logind[1653]: New session 1 of user core. Jan 27 04:46:22.336699 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 27 04:46:22.338774 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 27 04:46:22.372267 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 27 04:46:22.375449 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 27 04:46:22.400308 (systemd)[1795]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:46:22.402732 systemd-logind[1653]: New session 2 of user core. Jan 27 04:46:22.569171 systemd[1795]: Queued start job for default target default.target. Jan 27 04:46:22.579625 kubelet[1766]: E0127 04:46:22.579522 1766 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 27 04:46:22.582254 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 27 04:46:22.582384 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 27 04:46:22.584736 systemd[1]: kubelet.service: Consumed 813ms CPU time, 257.7M memory peak. Jan 27 04:46:22.585607 systemd[1795]: Created slice app.slice - User Application Slice. Jan 27 04:46:22.585744 systemd[1795]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 27 04:46:22.585757 systemd[1795]: Reached target paths.target - Paths. Jan 27 04:46:22.585824 systemd[1795]: Reached target timers.target - Timers. 
Jan 27 04:46:22.587225 systemd[1795]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 27 04:46:22.588067 systemd[1795]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 27 04:46:22.597390 systemd[1795]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 27 04:46:22.597822 systemd[1795]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 27 04:46:22.597940 systemd[1795]: Reached target sockets.target - Sockets. Jan 27 04:46:22.597979 systemd[1795]: Reached target basic.target - Basic System. Jan 27 04:46:22.598008 systemd[1795]: Reached target default.target - Main User Target. Jan 27 04:46:22.598030 systemd[1795]: Startup finished in 189ms. Jan 27 04:46:22.598279 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 27 04:46:22.600630 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 27 04:46:22.913943 systemd[1]: Started sshd@1-10.0.3.32:22-4.153.228.146:44274.service - OpenSSH per-connection server daemon (4.153.228.146:44274). Jan 27 04:46:23.214158 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 04:46:23.446515 sshd[1810]: Accepted publickey for core from 4.153.228.146 port 44274 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:46:23.447835 sshd-session[1810]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:46:23.452105 systemd-logind[1653]: New session 3 of user core. Jan 27 04:46:23.469597 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 27 04:46:23.658140 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 04:46:23.740082 sshd[1815]: Connection closed by 4.153.228.146 port 44274 Jan 27 04:46:23.740477 sshd-session[1810]: pam_unix(sshd:session): session closed for user core Jan 27 04:46:23.745281 systemd[1]: sshd@1-10.0.3.32:22-4.153.228.146:44274.service: Deactivated successfully. Jan 27 04:46:23.746983 systemd[1]: session-3.scope: Deactivated successfully. Jan 27 04:46:23.747728 systemd-logind[1653]: Session 3 logged out. Waiting for processes to exit. Jan 27 04:46:23.748831 systemd-logind[1653]: Removed session 3. Jan 27 04:46:23.848900 systemd[1]: Started sshd@2-10.0.3.32:22-4.153.228.146:44278.service - OpenSSH per-connection server daemon (4.153.228.146:44278). Jan 27 04:46:24.383051 sshd[1822]: Accepted publickey for core from 4.153.228.146 port 44278 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:46:24.384394 sshd-session[1822]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:46:24.388112 systemd-logind[1653]: New session 4 of user core. Jan 27 04:46:24.395495 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 27 04:46:24.671318 sshd[1826]: Connection closed by 4.153.228.146 port 44278 Jan 27 04:46:24.671726 sshd-session[1822]: pam_unix(sshd:session): session closed for user core Jan 27 04:46:24.676341 systemd[1]: sshd@2-10.0.3.32:22-4.153.228.146:44278.service: Deactivated successfully. Jan 27 04:46:24.678010 systemd[1]: session-4.scope: Deactivated successfully. Jan 27 04:46:24.678771 systemd-logind[1653]: Session 4 logged out. Waiting for processes to exit. Jan 27 04:46:24.679651 systemd-logind[1653]: Removed session 4. 
Jan 27 04:46:27.226168 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 04:46:27.238133 coreos-metadata[1638]: Jan 27 04:46:27.235 WARN failed to locate config-drive, using the metadata service API instead Jan 27 04:46:27.255981 coreos-metadata[1638]: Jan 27 04:46:27.255 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 27 04:46:27.667135 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 27 04:46:27.673428 coreos-metadata[1736]: Jan 27 04:46:27.673 WARN failed to locate config-drive, using the metadata service API instead Jan 27 04:46:27.686371 coreos-metadata[1736]: Jan 27 04:46:27.686 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 27 04:46:30.142392 coreos-metadata[1736]: Jan 27 04:46:30.142 INFO Fetch successful Jan 27 04:46:30.142392 coreos-metadata[1736]: Jan 27 04:46:30.142 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 27 04:46:30.758761 coreos-metadata[1638]: Jan 27 04:46:30.758 INFO Fetch successful Jan 27 04:46:30.758761 coreos-metadata[1638]: Jan 27 04:46:30.758 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 27 04:46:31.449231 coreos-metadata[1736]: Jan 27 04:46:31.448 INFO Fetch successful Jan 27 04:46:31.453828 unknown[1736]: wrote ssh authorized keys file for user: core Jan 27 04:46:31.456597 coreos-metadata[1638]: Jan 27 04:46:31.456 INFO Fetch successful Jan 27 04:46:31.456597 coreos-metadata[1638]: Jan 27 04:46:31.456 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 27 04:46:31.484306 update-ssh-keys[1841]: Updated "/home/core/.ssh/authorized_keys" Jan 27 04:46:31.485208 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 27 04:46:31.486514 systemd[1]: Finished sshkeys.service. Jan 27 04:46:32.114273 coreos-metadata[1638]: Jan 27 04:46:32.114 INFO Fetch successful Jan 27 04:46:32.114273 coreos-metadata[1638]: Jan 27 04:46:32.114 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 27 04:46:32.768813 coreos-metadata[1638]: Jan 27 04:46:32.768 INFO Fetch successful Jan 27 04:46:32.768813 coreos-metadata[1638]: Jan 27 04:46:32.768 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 27 04:46:32.833327 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 27 04:46:32.834902 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 04:46:32.961724 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 04:46:32.965594 (kubelet)[1852]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 27 04:46:33.002389 kubelet[1852]: E0127 04:46:33.002337 1852 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 27 04:46:33.005270 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 27 04:46:33.005394 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 27 04:46:33.005953 systemd[1]: kubelet.service: Consumed 147ms CPU time, 107.5M memory peak. 
Jan 27 04:46:33.420284 coreos-metadata[1638]: Jan 27 04:46:33.420 INFO Fetch successful Jan 27 04:46:33.420284 coreos-metadata[1638]: Jan 27 04:46:33.420 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 27 04:46:34.079619 coreos-metadata[1638]: Jan 27 04:46:34.079 INFO Fetch successful Jan 27 04:46:34.103046 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 27 04:46:34.103669 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 27 04:46:34.103804 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 27 04:46:34.104527 systemd[1]: Startup finished in 2.548s (kernel) + 18.383s (initrd) + 17.472s (userspace) = 38.404s. Jan 27 04:46:34.781102 systemd[1]: Started sshd@3-10.0.3.32:22-4.153.228.146:55142.service - OpenSSH per-connection server daemon (4.153.228.146:55142). Jan 27 04:46:35.321141 sshd[1866]: Accepted publickey for core from 4.153.228.146 port 55142 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:46:35.323497 sshd-session[1866]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:46:35.328167 systemd-logind[1653]: New session 5 of user core. Jan 27 04:46:35.346443 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 27 04:46:35.621813 sshd[1870]: Connection closed by 4.153.228.146 port 55142 Jan 27 04:46:35.622173 sshd-session[1866]: pam_unix(sshd:session): session closed for user core Jan 27 04:46:35.626044 systemd[1]: sshd@3-10.0.3.32:22-4.153.228.146:55142.service: Deactivated successfully. Jan 27 04:46:35.627655 systemd[1]: session-5.scope: Deactivated successfully. Jan 27 04:46:35.630723 systemd-logind[1653]: Session 5 logged out. Waiting for processes to exit. Jan 27 04:46:35.632041 systemd-logind[1653]: Removed session 5. Jan 27 04:46:35.732501 systemd[1]: Started sshd@4-10.0.3.32:22-4.153.228.146:39936.service - OpenSSH per-connection server daemon (4.153.228.146:39936). Jan 27 04:46:36.262237 sshd[1876]: Accepted publickey for core from 4.153.228.146 port 39936 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:46:36.263590 sshd-session[1876]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:46:36.267540 systemd-logind[1653]: New session 6 of user core. Jan 27 04:46:36.282495 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 27 04:46:36.550034 sshd[1880]: Connection closed by 4.153.228.146 port 39936 Jan 27 04:46:36.550407 sshd-session[1876]: pam_unix(sshd:session): session closed for user core Jan 27 04:46:36.555965 systemd[1]: sshd@4-10.0.3.32:22-4.153.228.146:39936.service: Deactivated successfully. Jan 27 04:46:36.557596 systemd[1]: session-6.scope: Deactivated successfully. Jan 27 04:46:36.558288 systemd-logind[1653]: Session 6 logged out. Waiting for processes to exit. Jan 27 04:46:36.559264 systemd-logind[1653]: Removed session 6. Jan 27 04:46:36.658519 systemd[1]: Started sshd@5-10.0.3.32:22-4.153.228.146:39952.service - OpenSSH per-connection server daemon (4.153.228.146:39952). Jan 27 04:46:37.191686 sshd[1886]: Accepted publickey for core from 4.153.228.146 port 39952 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:46:37.193002 sshd-session[1886]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:46:37.196989 systemd-logind[1653]: New session 7 of user core. 
Jan 27 04:46:37.214420 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 27 04:46:37.480050 sshd[1890]: Connection closed by 4.153.228.146 port 39952 Jan 27 04:46:37.480761 sshd-session[1886]: pam_unix(sshd:session): session closed for user core Jan 27 04:46:37.484742 systemd[1]: sshd@5-10.0.3.32:22-4.153.228.146:39952.service: Deactivated successfully. Jan 27 04:46:37.486409 systemd[1]: session-7.scope: Deactivated successfully. Jan 27 04:46:37.487130 systemd-logind[1653]: Session 7 logged out. Waiting for processes to exit. Jan 27 04:46:37.488064 systemd-logind[1653]: Removed session 7. Jan 27 04:46:37.589286 systemd[1]: Started sshd@6-10.0.3.32:22-4.153.228.146:39964.service - OpenSSH per-connection server daemon (4.153.228.146:39964). Jan 27 04:46:38.128133 sshd[1896]: Accepted publickey for core from 4.153.228.146 port 39964 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:46:38.129192 sshd-session[1896]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:46:38.133843 systemd-logind[1653]: New session 8 of user core. Jan 27 04:46:38.144471 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 27 04:46:38.344040 sudo[1901]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 27 04:46:38.344346 sudo[1901]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 27 04:46:38.361936 sudo[1901]: pam_unix(sudo:session): session closed for user root Jan 27 04:46:38.458209 sshd[1900]: Connection closed by 4.153.228.146 port 39964 Jan 27 04:46:38.458485 sshd-session[1896]: pam_unix(sshd:session): session closed for user core Jan 27 04:46:38.463330 systemd[1]: sshd@6-10.0.3.32:22-4.153.228.146:39964.service: Deactivated successfully. Jan 27 04:46:38.465420 systemd[1]: session-8.scope: Deactivated successfully. Jan 27 04:46:38.468206 systemd-logind[1653]: Session 8 logged out. Waiting for processes to exit. Jan 27 04:46:38.468992 systemd-logind[1653]: Removed session 8. Jan 27 04:46:38.570677 systemd[1]: Started sshd@7-10.0.3.32:22-4.153.228.146:39970.service - OpenSSH per-connection server daemon (4.153.228.146:39970). Jan 27 04:46:39.118053 sshd[1908]: Accepted publickey for core from 4.153.228.146 port 39970 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:46:39.119464 sshd-session[1908]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:46:39.123270 systemd-logind[1653]: New session 9 of user core. Jan 27 04:46:39.129320 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 27 04:46:39.321520 sudo[1914]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 27 04:46:39.322144 sudo[1914]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 27 04:46:39.328678 sudo[1914]: pam_unix(sudo:session): session closed for user root Jan 27 04:46:39.335060 sudo[1913]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 27 04:46:39.335350 sudo[1913]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 27 04:46:39.342854 systemd[1]: Starting audit-rules.service - Load Audit Rules... 
Jan 27 04:46:39.386000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 27 04:46:39.388312 augenrules[1938]: No rules Jan 27 04:46:39.388469 kernel: kauditd_printk_skb: 199 callbacks suppressed Jan 27 04:46:39.388508 kernel: audit: type=1305 audit(1769489199.386:243): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 27 04:46:39.388787 systemd[1]: audit-rules.service: Deactivated successfully. Jan 27 04:46:39.389065 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 27 04:46:39.386000 audit[1938]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=fffff3d59180 a2=420 a3=0 items=0 ppid=1919 pid=1938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:39.390933 sudo[1913]: pam_unix(sudo:session): session closed for user root Jan 27 04:46:39.393570 kernel: audit: type=1300 audit(1769489199.386:243): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=fffff3d59180 a2=420 a3=0 items=0 ppid=1919 pid=1938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:39.386000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 27 04:46:39.395111 kernel: audit: type=1327 audit(1769489199.386:243): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 27 04:46:39.389000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:39.397540 kernel: audit: type=1130 audit(1769489199.389:244): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:39.389000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:39.399931 kernel: audit: type=1131 audit(1769489199.389:245): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:39.399968 kernel: audit: type=1106 audit(1769489199.390:246): pid=1913 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 04:46:39.390000 audit[1913]: USER_END pid=1913 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 04:46:39.402588 kernel: audit: type=1104 audit(1769489199.390:247): pid=1913 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 27 04:46:39.390000 audit[1913]: CRED_DISP pid=1913 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 04:46:39.490552 sshd[1912]: Connection closed by 4.153.228.146 port 39970 Jan 27 04:46:39.491073 sshd-session[1908]: pam_unix(sshd:session): session closed for user core Jan 27 04:46:39.492000 audit[1908]: USER_END pid=1908 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:46:39.495360 systemd[1]: sshd@7-10.0.3.32:22-4.153.228.146:39970.service: Deactivated successfully. Jan 27 04:46:39.492000 audit[1908]: CRED_DISP pid=1908 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:46:39.497583 systemd[1]: session-9.scope: Deactivated successfully. Jan 27 04:46:39.499112 kernel: audit: type=1106 audit(1769489199.492:248): pid=1908 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:46:39.499172 kernel: audit: type=1104 audit(1769489199.492:249): pid=1908 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:46:39.499192 kernel: audit: type=1131 audit(1769489199.496:250): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.3.32:22-4.153.228.146:39970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:39.496000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.3.32:22-4.153.228.146:39970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:39.499460 systemd-logind[1653]: Session 9 logged out. Waiting for processes to exit. Jan 27 04:46:39.500727 systemd-logind[1653]: Removed session 9. Jan 27 04:46:39.595000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.3.32:22-4.153.228.146:39974 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:39.595416 systemd[1]: Started sshd@8-10.0.3.32:22-4.153.228.146:39974.service - OpenSSH per-connection server daemon (4.153.228.146:39974). 
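The audit records above and below carry the executed command line in their PROCTITLE field as a hex-encoded argv with NUL separators. A small decoder sketch, assuming plain UTF-8 arguments:

def decode_proctitle(hex_argv: str) -> str:
    # proctitle is the process argv, hex-encoded, with NUL bytes between arguments.
    raw = bytes.fromhex(hex_argv)
    return " ".join(a.decode("utf-8", "replace") for a in raw.split(b"\x00") if a)

# From the audit-rules record above:
print(decode_proctitle(
    "2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573"
))
# /sbin/auditctl -R /etc/audit/audit.rules

The docker NETFILTER_CFG records that follow decode the same way; for example, the first one resolves to /usr/bin/iptables --wait -t nat -N DOCKER.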
Jan 27 04:46:40.116000 audit[1947]: USER_ACCT pid=1947 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:46:40.116773 sshd[1947]: Accepted publickey for core from 4.153.228.146 port 39974 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:46:40.117000 audit[1947]: CRED_ACQ pid=1947 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:46:40.117000 audit[1947]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc97d2af0 a2=3 a3=0 items=0 ppid=1 pid=1947 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:40.117000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:46:40.118054 sshd-session[1947]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:46:40.122684 systemd-logind[1653]: New session 10 of user core. Jan 27 04:46:40.128551 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 27 04:46:40.130000 audit[1947]: USER_START pid=1947 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:46:40.132000 audit[1951]: CRED_ACQ pid=1951 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:46:40.313000 audit[1952]: USER_ACCT pid=1952 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 04:46:40.313000 audit[1952]: CRED_REFR pid=1952 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 04:46:40.313000 audit[1952]: USER_START pid=1952 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 04:46:40.313425 sudo[1952]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 27 04:46:40.313686 sudo[1952]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 27 04:46:40.644237 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Jan 27 04:46:40.654774 (dockerd)[1973]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 27 04:46:40.928451 dockerd[1973]: time="2026-01-27T04:46:40.928310720Z" level=info msg="Starting up" Jan 27 04:46:40.929263 dockerd[1973]: time="2026-01-27T04:46:40.929237040Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 27 04:46:40.940193 dockerd[1973]: time="2026-01-27T04:46:40.940149200Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 27 04:46:40.979821 systemd[1]: var-lib-docker-metacopy\x2dcheck1546566176-merged.mount: Deactivated successfully. Jan 27 04:46:40.991218 dockerd[1973]: time="2026-01-27T04:46:40.991140360Z" level=info msg="Loading containers: start." Jan 27 04:46:41.006121 kernel: Initializing XFRM netlink socket Jan 27 04:46:41.062000 audit[2027]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2027 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.062000 audit[2027]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffcc7652d0 a2=0 a3=0 items=0 ppid=1973 pid=2027 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.062000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 27 04:46:41.064000 audit[2029]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2029 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.064000 audit[2029]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffda586b00 a2=0 a3=0 items=0 ppid=1973 pid=2029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.064000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 27 04:46:41.065000 audit[2031]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2031 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.065000 audit[2031]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffebbceae0 a2=0 a3=0 items=0 ppid=1973 pid=2031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.065000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 27 04:46:41.067000 audit[2033]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2033 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.067000 audit[2033]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff68c74d0 a2=0 a3=0 items=0 ppid=1973 pid=2033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.067000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 27 04:46:41.069000 audit[2035]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2035 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.069000 audit[2035]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffff57b9a30 a2=0 a3=0 items=0 ppid=1973 pid=2035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.069000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 27 04:46:41.070000 audit[2037]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2037 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.070000 audit[2037]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffefa31790 a2=0 a3=0 items=0 ppid=1973 pid=2037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.070000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 27 04:46:41.072000 audit[2039]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2039 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.072000 audit[2039]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc5069940 a2=0 a3=0 items=0 ppid=1973 pid=2039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.072000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 27 04:46:41.074000 audit[2041]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2041 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.074000 audit[2041]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=fffff532c650 a2=0 a3=0 items=0 ppid=1973 pid=2041 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.074000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 27 04:46:41.104000 audit[2044]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2044 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.104000 audit[2044]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=ffffddf57b70 a2=0 a3=0 items=0 ppid=1973 pid=2044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.104000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 27 04:46:41.106000 audit[2046]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2046 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.106000 audit[2046]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffff5dfda00 a2=0 a3=0 items=0 ppid=1973 pid=2046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.106000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 27 04:46:41.107000 audit[2048]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2048 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.107000 audit[2048]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffe1e824e0 a2=0 a3=0 items=0 ppid=1973 pid=2048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.107000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 27 04:46:41.109000 audit[2050]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2050 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.109000 audit[2050]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffd6f0e3e0 a2=0 a3=0 items=0 ppid=1973 pid=2050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.109000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 27 04:46:41.111000 audit[2052]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2052 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.111000 audit[2052]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffd1b84f20 a2=0 a3=0 items=0 ppid=1973 pid=2052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.111000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 27 04:46:41.148000 audit[2082]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2082 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.148000 audit[2082]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=fffff20393b0 a2=0 a3=0 items=0 ppid=1973 pid=2082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.148000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 27 04:46:41.151000 audit[2084]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2084 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.151000 audit[2084]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffdd565b40 a2=0 a3=0 items=0 ppid=1973 pid=2084 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.151000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 27 04:46:41.152000 audit[2086]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2086 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.152000 audit[2086]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc9fab010 a2=0 a3=0 items=0 ppid=1973 pid=2086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.152000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 27 04:46:41.154000 audit[2088]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2088 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.154000 audit[2088]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff5e3d640 a2=0 a3=0 items=0 ppid=1973 pid=2088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.154000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 27 04:46:41.156000 audit[2090]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2090 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.156000 audit[2090]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd56d6910 a2=0 a3=0 items=0 ppid=1973 pid=2090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.156000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 27 04:46:41.158000 audit[2092]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2092 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.158000 audit[2092]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffd45237b0 a2=0 a3=0 items=0 ppid=1973 pid=2092 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.158000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 27 04:46:41.160000 audit[2094]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2094 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.160000 audit[2094]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc01e8670 a2=0 a3=0 items=0 ppid=1973 pid=2094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.160000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 27 04:46:41.162000 audit[2096]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2096 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.162000 audit[2096]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffdca1b660 a2=0 a3=0 items=0 ppid=1973 pid=2096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.162000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 27 04:46:41.164000 audit[2098]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2098 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.164000 audit[2098]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffd688cbf0 a2=0 a3=0 items=0 ppid=1973 pid=2098 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.164000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 27 04:46:41.166000 audit[2100]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2100 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.166000 audit[2100]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffff80c4680 a2=0 a3=0 items=0 ppid=1973 pid=2100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.166000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 27 04:46:41.168000 audit[2102]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2102 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.168000 audit[2102]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=fffff4c2bbe0 a2=0 a3=0 items=0 ppid=1973 pid=2102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.168000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 27 04:46:41.170000 audit[2104]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2104 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.170000 audit[2104]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=fffff08360c0 a2=0 a3=0 items=0 ppid=1973 pid=2104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.170000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 27 04:46:41.172000 audit[2106]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2106 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.172000 audit[2106]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffc2ee48f0 a2=0 a3=0 items=0 ppid=1973 pid=2106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.172000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 27 04:46:41.177000 audit[2111]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2111 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.177000 audit[2111]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc82cf6c0 a2=0 a3=0 items=0 ppid=1973 pid=2111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.177000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 27 04:46:41.179000 audit[2113]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2113 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.179000 audit[2113]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffd487b390 a2=0 a3=0 items=0 ppid=1973 pid=2113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.179000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 27 04:46:41.181000 audit[2115]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2115 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.181000 audit[2115]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffc6170d20 a2=0 a3=0 items=0 ppid=1973 pid=2115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.181000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 27 04:46:41.183000 audit[2117]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2117 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.183000 audit[2117]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffff76eb6d0 a2=0 a3=0 items=0 ppid=1973 pid=2117 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.183000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 27 04:46:41.184000 audit[2119]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2119 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.184000 audit[2119]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffc74a8f20 a2=0 a3=0 items=0 ppid=1973 pid=2119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.184000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 27 04:46:41.186000 audit[2121]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2121 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:41.186000 audit[2121]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=fffff9de1be0 a2=0 a3=0 items=0 ppid=1973 pid=2121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.186000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 27 04:46:41.209000 audit[2125]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2125 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.209000 audit[2125]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=ffffc05fd970 a2=0 a3=0 items=0 ppid=1973 pid=2125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.209000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 27 04:46:41.211000 audit[2127]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2127 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.211000 audit[2127]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffd7a17240 a2=0 a3=0 items=0 ppid=1973 pid=2127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.211000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 27 04:46:41.219000 audit[2135]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2135 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.219000 audit[2135]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=fffffc5c8650 a2=0 a3=0 items=0 ppid=1973 pid=2135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.219000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 27 04:46:41.234000 audit[2141]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2141 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.234000 audit[2141]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffcd7a9b00 a2=0 a3=0 items=0 ppid=1973 pid=2141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.234000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 27 04:46:41.236000 audit[2143]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2143 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.236000 audit[2143]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=ffffde31ff40 a2=0 a3=0 items=0 ppid=1973 pid=2143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.236000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 27 04:46:41.238000 audit[2145]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2145 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.238000 audit[2145]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffcd7a60c0 a2=0 a3=0 items=0 ppid=1973 pid=2145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.238000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 27 04:46:41.240000 audit[2147]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2147 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.240000 audit[2147]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=fffff1c18710 a2=0 a3=0 items=0 ppid=1973 pid=2147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.240000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 27 04:46:41.242000 audit[2149]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:41.242000 audit[2149]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffc9d95700 a2=0 a3=0 items=0 ppid=1973 pid=2149 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:41.242000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 27 04:46:41.243171 systemd-networkd[1495]: docker0: Link UP Jan 27 04:46:41.250625 dockerd[1973]: time="2026-01-27T04:46:41.250573040Z" level=info msg="Loading containers: done." Jan 27 04:46:41.262816 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2870653242-merged.mount: Deactivated successfully. Jan 27 04:46:41.280844 dockerd[1973]: time="2026-01-27T04:46:41.280786520Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 27 04:46:41.281009 dockerd[1973]: time="2026-01-27T04:46:41.280876480Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 27 04:46:41.281138 dockerd[1973]: time="2026-01-27T04:46:41.281113920Z" level=info msg="Initializing buildkit" Jan 27 04:46:41.322224 dockerd[1973]: time="2026-01-27T04:46:41.322180200Z" level=info msg="Completed buildkit initialization" Jan 27 04:46:41.328723 dockerd[1973]: time="2026-01-27T04:46:41.328678440Z" level=info msg="Daemon has completed initialization" Jan 27 04:46:41.329000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:41.329303 dockerd[1973]: time="2026-01-27T04:46:41.328748400Z" level=info msg="API listen on /run/docker.sock" Jan 27 04:46:41.328997 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 27 04:46:42.703115 containerd[1666]: time="2026-01-27T04:46:42.702829280Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\"" Jan 27 04:46:43.255757 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 27 04:46:43.258394 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 04:46:43.457340 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 04:46:43.457000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:43.478465 (kubelet)[2199]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 27 04:46:43.520121 kubelet[2199]: E0127 04:46:43.519905 2199 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 27 04:46:43.522599 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 27 04:46:43.522735 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
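The NETFILTER_CFG records above end with PROCTITLE fields that store each command line as NUL-separated hex. A minimal Python sketch to read one back (the sample value is copied from the first ip6tables record in this block and should decode to /usr/bin/ip6tables --wait -t nat -N DOCKER):

    # Decode an audit PROCTITLE value back into the original argv.
    proctitle = (
        "2F7573722F62696E2F6970367461626C6573002D2D77616974"
        "002D74006E6174002D4E00444F434B4552"
    )
    argv = bytes.fromhex(proctitle).split(b"\x00")
    print(" ".join(arg.decode() for arg in argv))
    # -> /usr/bin/ip6tables --wait -t nat -N DOCKER

Applied to the rest of the block, the records show dockerd (ppid 1973) creating its usual DOCKER, DOCKER-FORWARD, DOCKER-BRIDGE, DOCKER-CT, DOCKER-ISOLATION-STAGE-1/2 and DOCKER-USER chains in both the IPv4 (family=2) and IPv6 (family=10) tables.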
Jan 27 04:46:43.522000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 27 04:46:43.523099 systemd[1]: kubelet.service: Consumed 145ms CPU time, 105.5M memory peak. Jan 27 04:46:43.597601 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2445143095.mount: Deactivated successfully. Jan 27 04:46:44.021540 chronyd[1636]: Selected source PHC0 Jan 27 04:46:44.196134 containerd[1666]: time="2026-01-27T04:46:44.196061240Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:44.198669 containerd[1666]: time="2026-01-27T04:46:44.198438645Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.11: active requests=0, bytes read=24846206" Jan 27 04:46:44.199608 containerd[1666]: time="2026-01-27T04:46:44.199579424Z" level=info msg="ImageCreate event name:\"sha256:58951ea1a0b5de44646ea292c94b9350f33f22d147fccfd84bdc405eaabc442c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:44.202101 containerd[1666]: time="2026-01-27T04:46:44.202058214Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:44.203116 containerd[1666]: time="2026-01-27T04:46:44.203077222Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.11\" with image id \"sha256:58951ea1a0b5de44646ea292c94b9350f33f22d147fccfd84bdc405eaabc442c\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\", size \"26438581\" in 1.500208099s" Jan 27 04:46:44.203171 containerd[1666]: time="2026-01-27T04:46:44.203124853Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\" returns image reference \"sha256:58951ea1a0b5de44646ea292c94b9350f33f22d147fccfd84bdc405eaabc442c\"" Jan 27 04:46:44.203655 containerd[1666]: time="2026-01-27T04:46:44.203635218Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\"" Jan 27 04:46:45.245799 containerd[1666]: time="2026-01-27T04:46:45.245737122Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:45.247346 containerd[1666]: time="2026-01-27T04:46:45.247294290Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.11: active requests=0, bytes read=22613932" Jan 27 04:46:45.248786 containerd[1666]: time="2026-01-27T04:46:45.248750294Z" level=info msg="ImageCreate event name:\"sha256:82766e5f2d560b930b7069c03ec1366dc8fdb4a490c3005266d2fdc4ca21c2fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:45.251725 containerd[1666]: time="2026-01-27T04:46:45.251690352Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:45.252546 containerd[1666]: time="2026-01-27T04:46:45.252509526Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.11\" with image id 
\"sha256:82766e5f2d560b930b7069c03ec1366dc8fdb4a490c3005266d2fdc4ca21c2fc\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\", size \"24206567\" in 1.048841784s" Jan 27 04:46:45.252582 containerd[1666]: time="2026-01-27T04:46:45.252544213Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\" returns image reference \"sha256:82766e5f2d560b930b7069c03ec1366dc8fdb4a490c3005266d2fdc4ca21c2fc\"" Jan 27 04:46:45.253136 containerd[1666]: time="2026-01-27T04:46:45.253113099Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\"" Jan 27 04:46:46.311769 containerd[1666]: time="2026-01-27T04:46:46.311726564Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:46.313530 containerd[1666]: time="2026-01-27T04:46:46.313489506Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.11: active requests=0, bytes read=17608611" Jan 27 04:46:46.314702 containerd[1666]: time="2026-01-27T04:46:46.314676962Z" level=info msg="ImageCreate event name:\"sha256:cfa17ff3d66343f03eadbc235264b0615de49cc1f43da12cddba27d80c61f2c6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:46.318280 containerd[1666]: time="2026-01-27T04:46:46.318250660Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:46.319729 containerd[1666]: time="2026-01-27T04:46:46.319691484Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.11\" with image id \"sha256:cfa17ff3d66343f03eadbc235264b0615de49cc1f43da12cddba27d80c61f2c6\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\", size \"19201246\" in 1.066547328s" Jan 27 04:46:46.319772 containerd[1666]: time="2026-01-27T04:46:46.319728921Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\" returns image reference \"sha256:cfa17ff3d66343f03eadbc235264b0615de49cc1f43da12cddba27d80c61f2c6\"" Jan 27 04:46:46.320566 containerd[1666]: time="2026-01-27T04:46:46.320387714Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\"" Jan 27 04:46:47.182977 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount135421373.mount: Deactivated successfully. 
Jan 27 04:46:47.417422 containerd[1666]: time="2026-01-27T04:46:47.417351151Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:47.419634 containerd[1666]: time="2026-01-27T04:46:47.419280488Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.11: active requests=0, bytes read=17713718" Jan 27 04:46:47.422879 containerd[1666]: time="2026-01-27T04:46:47.422845532Z" level=info msg="ImageCreate event name:\"sha256:dcdb790dc2bfe6e0b86f702c7f336a38eaef34f6370eb6ff68f4e5b03ed4d425\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:47.424778 containerd[1666]: time="2026-01-27T04:46:47.424751512Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:47.425505 containerd[1666]: time="2026-01-27T04:46:47.425477333Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.11\" with image id \"sha256:dcdb790dc2bfe6e0b86f702c7f336a38eaef34f6370eb6ff68f4e5b03ed4d425\", repo tag \"registry.k8s.io/kube-proxy:v1.32.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\", size \"27557743\" in 1.105060871s" Jan 27 04:46:47.425558 containerd[1666]: time="2026-01-27T04:46:47.425506263Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\" returns image reference \"sha256:dcdb790dc2bfe6e0b86f702c7f336a38eaef34f6370eb6ff68f4e5b03ed4d425\"" Jan 27 04:46:47.426058 containerd[1666]: time="2026-01-27T04:46:47.426037932Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jan 27 04:46:48.079938 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2444480201.mount: Deactivated successfully. 
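Each pull record reports the compressed "bytes read" for the fetch, the unpacked image "size", and the elapsed time, so a rough effective pull rate can be read straight off the log (the duration covers the whole pull, unpack included). A sketch using the kube-proxy figures just above; note that some records in this log report bytes read=0, so guard against dividing by zero:

    # Rough effective pull rate from a containerd pull record.
    bytes_read = 17_713_718          # "bytes read" for kube-proxy:v1.32.11 above
    elapsed_s = 1.105060871          # "... in 1.105060871s"
    rate = bytes_read / elapsed_s if elapsed_s else 0.0
    print(f"{rate / 1e6:.1f} MB/s")  # roughly 16 MB/s for this pull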
Jan 27 04:46:48.707159 containerd[1666]: time="2026-01-27T04:46:48.706960954Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:48.710906 containerd[1666]: time="2026-01-27T04:46:48.710815813Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=0" Jan 27 04:46:48.716734 containerd[1666]: time="2026-01-27T04:46:48.716636603Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:48.721769 containerd[1666]: time="2026-01-27T04:46:48.721735826Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:48.722595 containerd[1666]: time="2026-01-27T04:46:48.722567503Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.296402851s" Jan 27 04:46:48.722701 containerd[1666]: time="2026-01-27T04:46:48.722687074Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Jan 27 04:46:48.723548 containerd[1666]: time="2026-01-27T04:46:48.723509732Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 27 04:46:49.243607 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1157001474.mount: Deactivated successfully. 
Jan 27 04:46:49.256122 containerd[1666]: time="2026-01-27T04:46:49.255739926Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 27 04:46:49.256739 containerd[1666]: time="2026-01-27T04:46:49.256673263Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 27 04:46:49.260534 containerd[1666]: time="2026-01-27T04:46:49.260412169Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 27 04:46:49.265526 containerd[1666]: time="2026-01-27T04:46:49.265449380Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 27 04:46:49.266114 containerd[1666]: time="2026-01-27T04:46:49.265971349Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 542.428297ms" Jan 27 04:46:49.266114 containerd[1666]: time="2026-01-27T04:46:49.266007709Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Jan 27 04:46:49.266597 containerd[1666]: time="2026-01-27T04:46:49.266420757Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jan 27 04:46:49.957152 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1105506443.mount: Deactivated successfully. 
Jan 27 04:46:51.459125 containerd[1666]: time="2026-01-27T04:46:51.458846585Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:51.464980 containerd[1666]: time="2026-01-27T04:46:51.464898216Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=66060366" Jan 27 04:46:51.467429 containerd[1666]: time="2026-01-27T04:46:51.467366948Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:51.472558 containerd[1666]: time="2026-01-27T04:46:51.472443894Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:46:51.473759 containerd[1666]: time="2026-01-27T04:46:51.473641060Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 2.207187383s" Jan 27 04:46:51.473759 containerd[1666]: time="2026-01-27T04:46:51.473673581Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" Jan 27 04:46:53.690493 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 27 04:46:53.691881 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 04:46:53.847074 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 04:46:53.846000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:53.848154 kernel: kauditd_printk_skb: 134 callbacks suppressed Jan 27 04:46:53.848202 kernel: audit: type=1130 audit(1769489213.846:303): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:53.851873 (kubelet)[2422]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 27 04:46:53.889337 kubelet[2422]: E0127 04:46:53.889262 2422 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 27 04:46:53.891817 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 27 04:46:53.891949 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 27 04:46:53.892347 systemd[1]: kubelet.service: Consumed 144ms CPU time, 107.2M memory peak. Jan 27 04:46:53.891000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Jan 27 04:46:53.895125 kernel: audit: type=1131 audit(1769489213.891:304): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 27 04:46:56.852069 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 04:46:56.852245 systemd[1]: kubelet.service: Consumed 144ms CPU time, 107.2M memory peak. Jan 27 04:46:56.851000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:56.851000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:56.854540 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 04:46:56.857181 kernel: audit: type=1130 audit(1769489216.851:305): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:56.857242 kernel: audit: type=1131 audit(1769489216.851:306): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:56.881502 systemd[1]: Reload requested from client PID 2437 ('systemctl') (unit session-10.scope)... Jan 27 04:46:56.881520 systemd[1]: Reloading... Jan 27 04:46:56.964125 zram_generator::config[2483]: No configuration found. Jan 27 04:46:57.126498 systemd[1]: Reloading finished in 244 ms. 
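The kubelet failures above (restart counters 2 and 3) all trace back to the missing /var/lib/kubelet/config.yaml. On a kubeadm-provisioned node that file is normally written by kubeadm init/join, so the unit is expected to keep failing and restarting until provisioning reaches that step; that is an assumption here, since the log does not show the kubeadm run itself. A small sketch for summarizing the loop from captured journal text like the lines above:

    import re

    # Summarize kubelet restart attempts and missing-config errors in journal text.
    def kubelet_restart_summary(journal_text: str) -> dict:
        counters = [int(n) for n in re.findall(
            r"kubelet\.service: Scheduled restart job, restart counter is at (\d+)",
            journal_text)]
        config_errors = re.findall(
            r"failed to load kubelet config file.*?config\.yaml", journal_text)
        return {
            "last_restart_counter": max(counters, default=0),
            "missing_config_errors": len(config_errors),
        }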
Jan 27 04:46:57.150000 audit: BPF prog-id=66 op=LOAD Jan 27 04:46:57.155137 kernel: audit: type=1334 audit(1769489217.150:307): prog-id=66 op=LOAD Jan 27 04:46:57.155205 kernel: audit: type=1334 audit(1769489217.150:308): prog-id=52 op=UNLOAD Jan 27 04:46:57.150000 audit: BPF prog-id=52 op=UNLOAD Jan 27 04:46:57.153000 audit: BPF prog-id=67 op=LOAD Jan 27 04:46:57.156103 kernel: audit: type=1334 audit(1769489217.153:309): prog-id=67 op=LOAD Jan 27 04:46:57.162000 audit: BPF prog-id=56 op=UNLOAD Jan 27 04:46:57.165000 audit: BPF prog-id=68 op=LOAD Jan 27 04:46:57.167594 kernel: audit: type=1334 audit(1769489217.162:310): prog-id=56 op=UNLOAD Jan 27 04:46:57.167641 kernel: audit: type=1334 audit(1769489217.165:311): prog-id=68 op=LOAD Jan 27 04:46:57.165000 audit: BPF prog-id=69 op=LOAD Jan 27 04:46:57.168428 kernel: audit: type=1334 audit(1769489217.165:312): prog-id=69 op=LOAD Jan 27 04:46:57.165000 audit: BPF prog-id=57 op=UNLOAD Jan 27 04:46:57.165000 audit: BPF prog-id=58 op=UNLOAD Jan 27 04:46:57.167000 audit: BPF prog-id=70 op=LOAD Jan 27 04:46:57.167000 audit: BPF prog-id=71 op=LOAD Jan 27 04:46:57.167000 audit: BPF prog-id=54 op=UNLOAD Jan 27 04:46:57.167000 audit: BPF prog-id=55 op=UNLOAD Jan 27 04:46:57.167000 audit: BPF prog-id=72 op=LOAD Jan 27 04:46:57.167000 audit: BPF prog-id=59 op=UNLOAD Jan 27 04:46:57.167000 audit: BPF prog-id=73 op=LOAD Jan 27 04:46:57.167000 audit: BPF prog-id=74 op=LOAD Jan 27 04:46:57.167000 audit: BPF prog-id=60 op=UNLOAD Jan 27 04:46:57.167000 audit: BPF prog-id=61 op=UNLOAD Jan 27 04:46:57.168000 audit: BPF prog-id=75 op=LOAD Jan 27 04:46:57.168000 audit: BPF prog-id=49 op=UNLOAD Jan 27 04:46:57.168000 audit: BPF prog-id=76 op=LOAD Jan 27 04:46:57.168000 audit: BPF prog-id=77 op=LOAD Jan 27 04:46:57.168000 audit: BPF prog-id=50 op=UNLOAD Jan 27 04:46:57.168000 audit: BPF prog-id=51 op=UNLOAD Jan 27 04:46:57.169000 audit: BPF prog-id=78 op=LOAD Jan 27 04:46:57.169000 audit: BPF prog-id=63 op=UNLOAD Jan 27 04:46:57.169000 audit: BPF prog-id=79 op=LOAD Jan 27 04:46:57.169000 audit: BPF prog-id=80 op=LOAD Jan 27 04:46:57.169000 audit: BPF prog-id=64 op=UNLOAD Jan 27 04:46:57.169000 audit: BPF prog-id=65 op=UNLOAD Jan 27 04:46:57.170000 audit: BPF prog-id=81 op=LOAD Jan 27 04:46:57.170000 audit: BPF prog-id=53 op=UNLOAD Jan 27 04:46:57.171000 audit: BPF prog-id=82 op=LOAD Jan 27 04:46:57.171000 audit: BPF prog-id=46 op=UNLOAD Jan 27 04:46:57.171000 audit: BPF prog-id=83 op=LOAD Jan 27 04:46:57.171000 audit: BPF prog-id=84 op=LOAD Jan 27 04:46:57.171000 audit: BPF prog-id=47 op=UNLOAD Jan 27 04:46:57.171000 audit: BPF prog-id=48 op=UNLOAD Jan 27 04:46:57.172000 audit: BPF prog-id=85 op=LOAD Jan 27 04:46:57.172000 audit: BPF prog-id=62 op=UNLOAD Jan 27 04:46:57.189054 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 27 04:46:57.189144 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 27 04:46:57.189449 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 04:46:57.188000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 27 04:46:57.189507 systemd[1]: kubelet.service: Consumed 90ms CPU time, 95.1M memory peak. Jan 27 04:46:57.191157 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 04:46:57.307054 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 27 04:46:57.306000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:46:57.324691 (kubelet)[2531]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 27 04:46:57.359208 kubelet[2531]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 04:46:57.359208 kubelet[2531]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 27 04:46:57.359208 kubelet[2531]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 04:46:57.359541 kubelet[2531]: I0127 04:46:57.359235 2531 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 27 04:46:58.824705 kubelet[2531]: I0127 04:46:58.824308 2531 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 27 04:46:58.824705 kubelet[2531]: I0127 04:46:58.824375 2531 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 27 04:46:58.825067 kubelet[2531]: I0127 04:46:58.824831 2531 server.go:954] "Client rotation is on, will bootstrap in background" Jan 27 04:46:58.874639 kubelet[2531]: E0127 04:46:58.874577 2531 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.3.32:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.3.32:6443: connect: connection refused" logger="UnhandledError" Jan 27 04:46:58.881603 kubelet[2531]: I0127 04:46:58.881549 2531 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 27 04:46:58.890021 kubelet[2531]: I0127 04:46:58.889999 2531 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 27 04:46:58.892630 kubelet[2531]: I0127 04:46:58.892599 2531 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 27 04:46:58.893586 kubelet[2531]: I0127 04:46:58.893529 2531 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 27 04:46:58.893740 kubelet[2531]: I0127 04:46:58.893570 2531 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4592-0-0-n-c2731c5fad","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 27 04:46:58.893830 kubelet[2531]: I0127 04:46:58.893812 2531 topology_manager.go:138] "Creating topology manager with none policy" Jan 27 04:46:58.893830 kubelet[2531]: I0127 04:46:58.893822 2531 container_manager_linux.go:304] "Creating device plugin manager" Jan 27 04:46:58.894034 kubelet[2531]: I0127 04:46:58.894018 2531 state_mem.go:36] "Initialized new in-memory state store" Jan 27 04:46:58.898648 kubelet[2531]: I0127 04:46:58.898616 2531 kubelet.go:446] "Attempting to sync node with API server" Jan 27 04:46:58.898648 kubelet[2531]: I0127 04:46:58.898639 2531 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 27 04:46:58.898826 kubelet[2531]: I0127 04:46:58.898660 2531 kubelet.go:352] "Adding apiserver pod source" Jan 27 04:46:58.898826 kubelet[2531]: I0127 04:46:58.898671 2531 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 27 04:46:58.907978 kubelet[2531]: W0127 04:46:58.907913 2531 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.3.32:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.3.32:6443: connect: connection refused Jan 27 04:46:58.907978 kubelet[2531]: E0127 04:46:58.907979 2531 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.3.32:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.3.32:6443: connect: connection refused" logger="UnhandledError" Jan 27 04:46:58.908320 kubelet[2531]: I0127 04:46:58.908082 
2531 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 27 04:46:58.908949 kubelet[2531]: I0127 04:46:58.908892 2531 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 27 04:46:58.909065 kubelet[2531]: W0127 04:46:58.909047 2531 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 27 04:46:58.911245 kubelet[2531]: W0127 04:46:58.911205 2531 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.3.32:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4592-0-0-n-c2731c5fad&limit=500&resourceVersion=0": dial tcp 10.0.3.32:6443: connect: connection refused Jan 27 04:46:58.911374 kubelet[2531]: E0127 04:46:58.911351 2531 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.3.32:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4592-0-0-n-c2731c5fad&limit=500&resourceVersion=0\": dial tcp 10.0.3.32:6443: connect: connection refused" logger="UnhandledError" Jan 27 04:46:58.911837 kubelet[2531]: I0127 04:46:58.911818 2531 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 27 04:46:58.911919 kubelet[2531]: I0127 04:46:58.911910 2531 server.go:1287] "Started kubelet" Jan 27 04:46:58.913964 kubelet[2531]: I0127 04:46:58.913914 2531 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 27 04:46:58.914181 kubelet[2531]: I0127 04:46:58.914125 2531 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 27 04:46:58.915502 kubelet[2531]: I0127 04:46:58.915139 2531 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 27 04:46:58.916174 kubelet[2531]: I0127 04:46:58.916152 2531 server.go:479] "Adding debug handlers to kubelet server" Jan 27 04:46:58.916870 kubelet[2531]: I0127 04:46:58.916841 2531 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 27 04:46:58.917372 kubelet[2531]: I0127 04:46:58.917324 2531 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 27 04:46:58.917501 kubelet[2531]: I0127 04:46:58.917454 2531 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 27 04:46:58.919250 kubelet[2531]: I0127 04:46:58.919224 2531 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 27 04:46:58.919317 kubelet[2531]: I0127 04:46:58.919281 2531 reconciler.go:26] "Reconciler: start to sync state" Jan 27 04:46:58.919432 kubelet[2531]: E0127 04:46:58.919410 2531 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4592-0-0-n-c2731c5fad\" not found" Jan 27 04:46:58.919495 kubelet[2531]: E0127 04:46:58.919463 2531 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.3.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4592-0-0-n-c2731c5fad?timeout=10s\": dial tcp 10.0.3.32:6443: connect: connection refused" interval="200ms" Jan 27 04:46:58.920256 kubelet[2531]: I0127 04:46:58.920208 2531 factory.go:221] Registration of the systemd container factory successfully Jan 27 04:46:58.922073 kubelet[2531]: I0127 04:46:58.921970 2531 factory.go:219] Registration of the crio container factory 
failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 27 04:46:58.924625 kernel: kauditd_printk_skb: 36 callbacks suppressed Jan 27 04:46:58.924686 kernel: audit: type=1325 audit(1769489218.921:349): table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2544 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:58.921000 audit[2544]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2544 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:58.924765 kubelet[2531]: W0127 04:46:58.923897 2531 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.3.32:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.3.32:6443: connect: connection refused Jan 27 04:46:58.924765 kubelet[2531]: E0127 04:46:58.923943 2531 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.3.32:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.3.32:6443: connect: connection refused" logger="UnhandledError" Jan 27 04:46:58.921000 audit[2544]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffc9bd44c0 a2=0 a3=0 items=0 ppid=2531 pid=2544 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:58.926756 kubelet[2531]: I0127 04:46:58.926640 2531 factory.go:221] Registration of the containerd container factory successfully Jan 27 04:46:58.926756 kubelet[2531]: E0127 04:46:58.926699 2531 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 27 04:46:58.928211 kernel: audit: type=1300 audit(1769489218.921:349): arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffc9bd44c0 a2=0 a3=0 items=0 ppid=2531 pid=2544 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:58.921000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 27 04:46:58.931945 kernel: audit: type=1327 audit(1769489218.921:349): proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 27 04:46:58.932001 kernel: audit: type=1325 audit(1769489218.923:350): table=filter:43 family=2 entries=1 op=nft_register_chain pid=2545 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:58.923000 audit[2545]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2545 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:58.933856 kernel: audit: type=1300 audit(1769489218.923:350): arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffb52d8c0 a2=0 a3=0 items=0 ppid=2531 pid=2545 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:58.923000 audit[2545]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffb52d8c0 a2=0 a3=0 items=0 ppid=2531 pid=2545 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:58.937159 kernel: audit: type=1327 audit(1769489218.923:350): proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 27 04:46:58.923000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 27 04:46:58.937268 kubelet[2531]: E0127 04:46:58.935849 2531 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.3.32:6443/api/v1/namespaces/default/events\": dial tcp 10.0.3.32:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4592-0-0-n-c2731c5fad.188e7d11c7206f01 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4592-0-0-n-c2731c5fad,UID:ci-4592-0-0-n-c2731c5fad,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4592-0-0-n-c2731c5fad,},FirstTimestamp:2026-01-27 04:46:58.911891201 +0000 UTC m=+1.584252484,LastTimestamp:2026-01-27 04:46:58.911891201 +0000 UTC m=+1.584252484,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4592-0-0-n-c2731c5fad,}" Jan 27 04:46:58.937268 kubelet[2531]: I0127 04:46:58.936566 2531 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 27 04:46:58.937268 kubelet[2531]: I0127 04:46:58.936585 2531 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 27 04:46:58.937268 kubelet[2531]: I0127 04:46:58.936602 2531 state_mem.go:36] "Initialized new in-memory state store" 
Jan 27 04:46:58.937831 kernel: audit: type=1325 audit(1769489218.926:351): table=filter:44 family=2 entries=2 op=nft_register_chain pid=2547 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:58.926000 audit[2547]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2547 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:58.939516 kernel: audit: type=1300 audit(1769489218.926:351): arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffdf22bcc0 a2=0 a3=0 items=0 ppid=2531 pid=2547 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:58.926000 audit[2547]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffdf22bcc0 a2=0 a3=0 items=0 ppid=2531 pid=2547 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:58.942069 kubelet[2531]: I0127 04:46:58.941994 2531 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 27 04:46:58.943219 kubelet[2531]: I0127 04:46:58.943196 2531 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 27 04:46:58.943219 kubelet[2531]: I0127 04:46:58.943223 2531 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 27 04:46:58.943308 kubelet[2531]: I0127 04:46:58.943243 2531 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 27 04:46:58.943308 kubelet[2531]: I0127 04:46:58.943250 2531 kubelet.go:2382] "Starting kubelet main sync loop" Jan 27 04:46:58.943308 kubelet[2531]: E0127 04:46:58.943290 2531 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 27 04:46:58.944494 kubelet[2531]: W0127 04:46:58.944452 2531 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.3.32:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.3.32:6443: connect: connection refused Jan 27 04:46:58.944683 kubelet[2531]: E0127 04:46:58.944628 2531 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.3.32:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.3.32:6443: connect: connection refused" logger="UnhandledError" Jan 27 04:46:58.926000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 27 04:46:58.945933 kubelet[2531]: I0127 04:46:58.945588 2531 policy_none.go:49] "None policy: Start" Jan 27 04:46:58.945933 kubelet[2531]: I0127 04:46:58.945617 2531 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 27 04:46:58.945933 kubelet[2531]: I0127 04:46:58.945629 2531 state_mem.go:35] "Initializing new in-memory state store" Jan 27 04:46:58.947006 kernel: audit: type=1327 audit(1769489218.926:351): proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 27 04:46:58.947219 kernel: audit: type=1325 audit(1769489218.928:352): table=filter:45 family=2 
entries=2 op=nft_register_chain pid=2549 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:58.928000 audit[2549]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2549 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:58.928000 audit[2549]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffc3805bd0 a2=0 a3=0 items=0 ppid=2531 pid=2549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:58.928000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 27 04:46:58.940000 audit[2555]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2555 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:58.940000 audit[2555]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=ffffeef466c0 a2=0 a3=0 items=0 ppid=2531 pid=2555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:58.940000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 27 04:46:58.941000 audit[2556]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2556 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:58.941000 audit[2556]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffd5963140 a2=0 a3=0 items=0 ppid=2531 pid=2556 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:58.941000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 27 04:46:58.941000 audit[2557]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2557 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:58.941000 audit[2557]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc5430a50 a2=0 a3=0 items=0 ppid=2531 pid=2557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:58.941000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 27 04:46:58.942000 audit[2559]: NETFILTER_CFG table=mangle:49 family=10 entries=1 op=nft_register_chain pid=2559 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:58.942000 audit[2559]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff3edcef0 a2=0 a3=0 items=0 ppid=2531 pid=2559 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:58.942000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 27 04:46:58.943000 audit[2561]: NETFILTER_CFG table=nat:50 family=10 entries=1 op=nft_register_chain pid=2561 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:58.943000 audit[2561]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff15759f0 a2=0 a3=0 items=0 ppid=2531 pid=2561 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:58.943000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 27 04:46:58.943000 audit[2560]: NETFILTER_CFG table=nat:51 family=2 entries=1 op=nft_register_chain pid=2560 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:58.943000 audit[2560]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdbcaaab0 a2=0 a3=0 items=0 ppid=2531 pid=2560 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:58.943000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 27 04:46:58.945000 audit[2563]: NETFILTER_CFG table=filter:52 family=10 entries=1 op=nft_register_chain pid=2563 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:46:58.945000 audit[2563]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd81038c0 a2=0 a3=0 items=0 ppid=2531 pid=2563 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:58.945000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 27 04:46:58.947000 audit[2564]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_chain pid=2564 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:46:58.947000 audit[2564]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffef1fb200 a2=0 a3=0 items=0 ppid=2531 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:58.947000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 27 04:46:58.953441 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 27 04:46:58.967794 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 27 04:46:58.970747 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Jan 27 04:46:58.982341 kubelet[2531]: I0127 04:46:58.982311 2531 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 27 04:46:58.982775 kubelet[2531]: I0127 04:46:58.982719 2531 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 27 04:46:58.982775 kubelet[2531]: I0127 04:46:58.982734 2531 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 27 04:46:58.983349 kubelet[2531]: I0127 04:46:58.983111 2531 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 27 04:46:58.984324 kubelet[2531]: E0127 04:46:58.984259 2531 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 27 04:46:58.984324 kubelet[2531]: E0127 04:46:58.984314 2531 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4592-0-0-n-c2731c5fad\" not found" Jan 27 04:46:59.054246 systemd[1]: Created slice kubepods-burstable-pod9f0a1cead54afc0be00aa20c32026bef.slice - libcontainer container kubepods-burstable-pod9f0a1cead54afc0be00aa20c32026bef.slice. Jan 27 04:46:59.068397 kubelet[2531]: E0127 04:46:59.068323 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-n-c2731c5fad\" not found" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.068822 systemd[1]: Created slice kubepods-burstable-podfed1eea61a1c438004a60752c1c3b064.slice - libcontainer container kubepods-burstable-podfed1eea61a1c438004a60752c1c3b064.slice. Jan 27 04:46:59.070681 kubelet[2531]: E0127 04:46:59.070658 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-n-c2731c5fad\" not found" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.072114 systemd[1]: Created slice kubepods-burstable-pod4d3e8bbb89caa8d20e7e50b2f3a936f9.slice - libcontainer container kubepods-burstable-pod4d3e8bbb89caa8d20e7e50b2f3a936f9.slice. 
Jan 27 04:46:59.073364 kubelet[2531]: E0127 04:46:59.073337 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-n-c2731c5fad\" not found" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.085957 kubelet[2531]: I0127 04:46:59.085838 2531 kubelet_node_status.go:75] "Attempting to register node" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.086529 kubelet[2531]: E0127 04:46:59.086489 2531 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.3.32:6443/api/v1/nodes\": dial tcp 10.0.3.32:6443: connect: connection refused" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.120334 kubelet[2531]: E0127 04:46:59.120259 2531 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.3.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4592-0-0-n-c2731c5fad?timeout=10s\": dial tcp 10.0.3.32:6443: connect: connection refused" interval="400ms" Jan 27 04:46:59.120334 kubelet[2531]: I0127 04:46:59.120318 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fed1eea61a1c438004a60752c1c3b064-ca-certs\") pod \"kube-controller-manager-ci-4592-0-0-n-c2731c5fad\" (UID: \"fed1eea61a1c438004a60752c1c3b064\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.120762 kubelet[2531]: I0127 04:46:59.120393 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fed1eea61a1c438004a60752c1c3b064-k8s-certs\") pod \"kube-controller-manager-ci-4592-0-0-n-c2731c5fad\" (UID: \"fed1eea61a1c438004a60752c1c3b064\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.120762 kubelet[2531]: I0127 04:46:59.120455 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fed1eea61a1c438004a60752c1c3b064-kubeconfig\") pod \"kube-controller-manager-ci-4592-0-0-n-c2731c5fad\" (UID: \"fed1eea61a1c438004a60752c1c3b064\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.120762 kubelet[2531]: I0127 04:46:59.120502 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4d3e8bbb89caa8d20e7e50b2f3a936f9-kubeconfig\") pod \"kube-scheduler-ci-4592-0-0-n-c2731c5fad\" (UID: \"4d3e8bbb89caa8d20e7e50b2f3a936f9\") " pod="kube-system/kube-scheduler-ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.120762 kubelet[2531]: I0127 04:46:59.120542 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9f0a1cead54afc0be00aa20c32026bef-ca-certs\") pod \"kube-apiserver-ci-4592-0-0-n-c2731c5fad\" (UID: \"9f0a1cead54afc0be00aa20c32026bef\") " pod="kube-system/kube-apiserver-ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.120762 kubelet[2531]: I0127 04:46:59.120555 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9f0a1cead54afc0be00aa20c32026bef-k8s-certs\") pod \"kube-apiserver-ci-4592-0-0-n-c2731c5fad\" (UID: \"9f0a1cead54afc0be00aa20c32026bef\") " pod="kube-system/kube-apiserver-ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.120868 
kubelet[2531]: I0127 04:46:59.120570 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fed1eea61a1c438004a60752c1c3b064-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4592-0-0-n-c2731c5fad\" (UID: \"fed1eea61a1c438004a60752c1c3b064\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.120868 kubelet[2531]: I0127 04:46:59.120587 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9f0a1cead54afc0be00aa20c32026bef-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4592-0-0-n-c2731c5fad\" (UID: \"9f0a1cead54afc0be00aa20c32026bef\") " pod="kube-system/kube-apiserver-ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.120868 kubelet[2531]: I0127 04:46:59.120636 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fed1eea61a1c438004a60752c1c3b064-flexvolume-dir\") pod \"kube-controller-manager-ci-4592-0-0-n-c2731c5fad\" (UID: \"fed1eea61a1c438004a60752c1c3b064\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.288935 kubelet[2531]: I0127 04:46:59.288879 2531 kubelet_node_status.go:75] "Attempting to register node" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.289490 kubelet[2531]: E0127 04:46:59.289430 2531 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.3.32:6443/api/v1/nodes\": dial tcp 10.0.3.32:6443: connect: connection refused" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.371508 containerd[1666]: time="2026-01-27T04:46:59.371288545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4592-0-0-n-c2731c5fad,Uid:9f0a1cead54afc0be00aa20c32026bef,Namespace:kube-system,Attempt:0,}" Jan 27 04:46:59.371508 containerd[1666]: time="2026-01-27T04:46:59.371418225Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4592-0-0-n-c2731c5fad,Uid:fed1eea61a1c438004a60752c1c3b064,Namespace:kube-system,Attempt:0,}" Jan 27 04:46:59.374057 containerd[1666]: time="2026-01-27T04:46:59.374000638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4592-0-0-n-c2731c5fad,Uid:4d3e8bbb89caa8d20e7e50b2f3a936f9,Namespace:kube-system,Attempt:0,}" Jan 27 04:46:59.436973 containerd[1666]: time="2026-01-27T04:46:59.436916599Z" level=info msg="connecting to shim 17981b37097758f25b3ef9ebec0f103ef2cee1c4897b5070e36c4704c3ffa990" address="unix:///run/containerd/s/db7da47802582d4315e67eed57737de56cae8d9006a5312d913a78b01e203405" namespace=k8s.io protocol=ttrpc version=3 Jan 27 04:46:59.440271 containerd[1666]: time="2026-01-27T04:46:59.440221936Z" level=info msg="connecting to shim 1acf1486575e86539efe0574e057fe5b875e35a867014752d29000ebecb77924" address="unix:///run/containerd/s/21c2f934621c4351eeaeac0c13b52aaa9db3181146d7cc55b04d404ddb3390bf" namespace=k8s.io protocol=ttrpc version=3 Jan 27 04:46:59.448301 containerd[1666]: time="2026-01-27T04:46:59.448260657Z" level=info msg="connecting to shim a5054b363b167c4dcd17a989241103c1998aa7752a8c543188088b6a45394437" address="unix:///run/containerd/s/6a8e1897254a49e2cdc3dc150020478f7d681d6af7ea52f06624d749ef58ad7f" namespace=k8s.io protocol=ttrpc version=3 Jan 27 04:46:59.466512 systemd[1]: Started 
cri-containerd-17981b37097758f25b3ef9ebec0f103ef2cee1c4897b5070e36c4704c3ffa990.scope - libcontainer container 17981b37097758f25b3ef9ebec0f103ef2cee1c4897b5070e36c4704c3ffa990. Jan 27 04:46:59.470423 systemd[1]: Started cri-containerd-a5054b363b167c4dcd17a989241103c1998aa7752a8c543188088b6a45394437.scope - libcontainer container a5054b363b167c4dcd17a989241103c1998aa7752a8c543188088b6a45394437. Jan 27 04:46:59.474676 systemd[1]: Started cri-containerd-1acf1486575e86539efe0574e057fe5b875e35a867014752d29000ebecb77924.scope - libcontainer container 1acf1486575e86539efe0574e057fe5b875e35a867014752d29000ebecb77924. Jan 27 04:46:59.480000 audit: BPF prog-id=86 op=LOAD Jan 27 04:46:59.481000 audit: BPF prog-id=87 op=LOAD Jan 27 04:46:59.481000 audit[2609]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=2579 pid=2609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137393831623337303937373538663235623365663965626563306631 Jan 27 04:46:59.481000 audit: BPF prog-id=87 op=UNLOAD Jan 27 04:46:59.481000 audit[2609]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2579 pid=2609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137393831623337303937373538663235623365663965626563306631 Jan 27 04:46:59.481000 audit: BPF prog-id=88 op=LOAD Jan 27 04:46:59.481000 audit[2609]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=2579 pid=2609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137393831623337303937373538663235623365663965626563306631 Jan 27 04:46:59.481000 audit: BPF prog-id=89 op=LOAD Jan 27 04:46:59.481000 audit[2609]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=2579 pid=2609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137393831623337303937373538663235623365663965626563306631 Jan 27 04:46:59.481000 audit: BPF prog-id=89 op=UNLOAD Jan 27 04:46:59.481000 audit[2609]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 
a3=0 items=0 ppid=2579 pid=2609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137393831623337303937373538663235623365663965626563306631 Jan 27 04:46:59.481000 audit: BPF prog-id=88 op=UNLOAD Jan 27 04:46:59.481000 audit[2609]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2579 pid=2609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137393831623337303937373538663235623365663965626563306631 Jan 27 04:46:59.481000 audit: BPF prog-id=90 op=LOAD Jan 27 04:46:59.481000 audit[2609]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=2579 pid=2609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137393831623337303937373538663235623365663965626563306631 Jan 27 04:46:59.483000 audit: BPF prog-id=91 op=LOAD Jan 27 04:46:59.483000 audit: BPF prog-id=92 op=LOAD Jan 27 04:46:59.483000 audit[2630]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2613 pid=2630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.483000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135303534623336336231363763346463643137613938393234313130 Jan 27 04:46:59.483000 audit: BPF prog-id=92 op=UNLOAD Jan 27 04:46:59.483000 audit[2630]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=2630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.483000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135303534623336336231363763346463643137613938393234313130 Jan 27 04:46:59.483000 audit: BPF prog-id=93 op=LOAD Jan 27 04:46:59.483000 audit[2630]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2613 pid=2630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.483000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135303534623336336231363763346463643137613938393234313130 Jan 27 04:46:59.483000 audit: BPF prog-id=94 op=LOAD Jan 27 04:46:59.483000 audit[2630]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2613 pid=2630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.483000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135303534623336336231363763346463643137613938393234313130 Jan 27 04:46:59.484000 audit: BPF prog-id=94 op=UNLOAD Jan 27 04:46:59.484000 audit[2630]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=2630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.484000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135303534623336336231363763346463643137613938393234313130 Jan 27 04:46:59.484000 audit: BPF prog-id=93 op=UNLOAD Jan 27 04:46:59.484000 audit[2630]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=2630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.484000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135303534623336336231363763346463643137613938393234313130 Jan 27 04:46:59.484000 audit: BPF prog-id=95 op=LOAD Jan 27 04:46:59.484000 audit[2630]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2613 pid=2630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.484000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135303534623336336231363763346463643137613938393234313130 Jan 27 04:46:59.486000 audit: BPF prog-id=96 op=LOAD Jan 27 04:46:59.487000 audit: BPF prog-id=97 op=LOAD Jan 27 04:46:59.487000 audit[2640]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2582 pid=2640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 27 04:46:59.487000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161636631343836353735653836353339656665303537346530353766 Jan 27 04:46:59.487000 audit: BPF prog-id=97 op=UNLOAD Jan 27 04:46:59.487000 audit[2640]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.487000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161636631343836353735653836353339656665303537346530353766 Jan 27 04:46:59.489000 audit: BPF prog-id=98 op=LOAD Jan 27 04:46:59.489000 audit[2640]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2582 pid=2640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.489000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161636631343836353735653836353339656665303537346530353766 Jan 27 04:46:59.489000 audit: BPF prog-id=99 op=LOAD Jan 27 04:46:59.489000 audit[2640]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2582 pid=2640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.489000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161636631343836353735653836353339656665303537346530353766 Jan 27 04:46:59.489000 audit: BPF prog-id=99 op=UNLOAD Jan 27 04:46:59.489000 audit[2640]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.489000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161636631343836353735653836353339656665303537346530353766 Jan 27 04:46:59.489000 audit: BPF prog-id=98 op=UNLOAD Jan 27 04:46:59.489000 audit[2640]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.489000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161636631343836353735653836353339656665303537346530353766 Jan 27 04:46:59.489000 audit: BPF prog-id=100 op=LOAD Jan 27 04:46:59.489000 audit[2640]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2582 pid=2640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.489000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161636631343836353735653836353339656665303537346530353766 Jan 27 04:46:59.513196 containerd[1666]: time="2026-01-27T04:46:59.513156748Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4592-0-0-n-c2731c5fad,Uid:4d3e8bbb89caa8d20e7e50b2f3a936f9,Namespace:kube-system,Attempt:0,} returns sandbox id \"a5054b363b167c4dcd17a989241103c1998aa7752a8c543188088b6a45394437\"" Jan 27 04:46:59.516487 containerd[1666]: time="2026-01-27T04:46:59.516448805Z" level=info msg="CreateContainer within sandbox \"a5054b363b167c4dcd17a989241103c1998aa7752a8c543188088b6a45394437\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 27 04:46:59.520340 containerd[1666]: time="2026-01-27T04:46:59.520305105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4592-0-0-n-c2731c5fad,Uid:9f0a1cead54afc0be00aa20c32026bef,Namespace:kube-system,Attempt:0,} returns sandbox id \"17981b37097758f25b3ef9ebec0f103ef2cee1c4897b5070e36c4704c3ffa990\"" Jan 27 04:46:59.520866 kubelet[2531]: E0127 04:46:59.520820 2531 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.3.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4592-0-0-n-c2731c5fad?timeout=10s\": dial tcp 10.0.3.32:6443: connect: connection refused" interval="800ms" Jan 27 04:46:59.523191 containerd[1666]: time="2026-01-27T04:46:59.523130519Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4592-0-0-n-c2731c5fad,Uid:fed1eea61a1c438004a60752c1c3b064,Namespace:kube-system,Attempt:0,} returns sandbox id \"1acf1486575e86539efe0574e057fe5b875e35a867014752d29000ebecb77924\"" Jan 27 04:46:59.523423 containerd[1666]: time="2026-01-27T04:46:59.523397201Z" level=info msg="CreateContainer within sandbox \"17981b37097758f25b3ef9ebec0f103ef2cee1c4897b5070e36c4704c3ffa990\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 27 04:46:59.527717 containerd[1666]: time="2026-01-27T04:46:59.527683983Z" level=info msg="CreateContainer within sandbox \"1acf1486575e86539efe0574e057fe5b875e35a867014752d29000ebecb77924\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 27 04:46:59.534832 containerd[1666]: time="2026-01-27T04:46:59.534798099Z" level=info msg="Container 8b800740a81a7b9f349a3cd5a2ccc68af54c45769cf118158a697e47ee982379: CDI devices from CRI Config.CDIDevices: []" Jan 27 04:46:59.548153 containerd[1666]: time="2026-01-27T04:46:59.548074847Z" level=info msg="CreateContainer within sandbox \"a5054b363b167c4dcd17a989241103c1998aa7752a8c543188088b6a45394437\" for 
&ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"8b800740a81a7b9f349a3cd5a2ccc68af54c45769cf118158a697e47ee982379\"" Jan 27 04:46:59.548822 containerd[1666]: time="2026-01-27T04:46:59.548794130Z" level=info msg="StartContainer for \"8b800740a81a7b9f349a3cd5a2ccc68af54c45769cf118158a697e47ee982379\"" Jan 27 04:46:59.549943 containerd[1666]: time="2026-01-27T04:46:59.549919536Z" level=info msg="connecting to shim 8b800740a81a7b9f349a3cd5a2ccc68af54c45769cf118158a697e47ee982379" address="unix:///run/containerd/s/6a8e1897254a49e2cdc3dc150020478f7d681d6af7ea52f06624d749ef58ad7f" protocol=ttrpc version=3 Jan 27 04:46:59.550341 containerd[1666]: time="2026-01-27T04:46:59.550292098Z" level=info msg="Container 08498c2f0d210a11d9a5e71fa62ac6005d06a02a6e4fa92ef42941bc7304d4f9: CDI devices from CRI Config.CDIDevices: []" Jan 27 04:46:59.563473 containerd[1666]: time="2026-01-27T04:46:59.563411005Z" level=info msg="CreateContainer within sandbox \"17981b37097758f25b3ef9ebec0f103ef2cee1c4897b5070e36c4704c3ffa990\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"08498c2f0d210a11d9a5e71fa62ac6005d06a02a6e4fa92ef42941bc7304d4f9\"" Jan 27 04:46:59.563882 containerd[1666]: time="2026-01-27T04:46:59.563861207Z" level=info msg="StartContainer for \"08498c2f0d210a11d9a5e71fa62ac6005d06a02a6e4fa92ef42941bc7304d4f9\"" Jan 27 04:46:59.564813 containerd[1666]: time="2026-01-27T04:46:59.564770932Z" level=info msg="connecting to shim 08498c2f0d210a11d9a5e71fa62ac6005d06a02a6e4fa92ef42941bc7304d4f9" address="unix:///run/containerd/s/db7da47802582d4315e67eed57737de56cae8d9006a5312d913a78b01e203405" protocol=ttrpc version=3 Jan 27 04:46:59.565951 containerd[1666]: time="2026-01-27T04:46:59.565881337Z" level=info msg="Container faeceb7f650cae6a2fa2c1ee6b11027e47f7d4bc28288c3c796884cf402156f8: CDI devices from CRI Config.CDIDevices: []" Jan 27 04:46:59.566281 systemd[1]: Started cri-containerd-8b800740a81a7b9f349a3cd5a2ccc68af54c45769cf118158a697e47ee982379.scope - libcontainer container 8b800740a81a7b9f349a3cd5a2ccc68af54c45769cf118158a697e47ee982379. 
Jan 27 04:46:59.575000 audit: BPF prog-id=101 op=LOAD Jan 27 04:46:59.576000 audit: BPF prog-id=102 op=LOAD Jan 27 04:46:59.576000 audit[2705]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2613 pid=2705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.576000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862383030373430613831613762396633343961336364356132636363 Jan 27 04:46:59.576000 audit: BPF prog-id=102 op=UNLOAD Jan 27 04:46:59.576000 audit[2705]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=2705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.576000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862383030373430613831613762396633343961336364356132636363 Jan 27 04:46:59.576000 audit: BPF prog-id=103 op=LOAD Jan 27 04:46:59.576000 audit[2705]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2613 pid=2705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.576000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862383030373430613831613762396633343961336364356132636363 Jan 27 04:46:59.576000 audit: BPF prog-id=104 op=LOAD Jan 27 04:46:59.576000 audit[2705]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2613 pid=2705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.576000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862383030373430613831613762396633343961336364356132636363 Jan 27 04:46:59.576000 audit: BPF prog-id=104 op=UNLOAD Jan 27 04:46:59.576000 audit[2705]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=2705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.576000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862383030373430613831613762396633343961336364356132636363 Jan 27 04:46:59.576000 audit: BPF prog-id=103 op=UNLOAD Jan 27 04:46:59.576000 audit[2705]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=2705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.576000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862383030373430613831613762396633343961336364356132636363 Jan 27 04:46:59.577000 audit: BPF prog-id=105 op=LOAD Jan 27 04:46:59.577000 audit[2705]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2613 pid=2705 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.577000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862383030373430613831613762396633343961336364356132636363 Jan 27 04:46:59.580966 containerd[1666]: time="2026-01-27T04:46:59.580516372Z" level=info msg="CreateContainer within sandbox \"1acf1486575e86539efe0574e057fe5b875e35a867014752d29000ebecb77924\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"faeceb7f650cae6a2fa2c1ee6b11027e47f7d4bc28288c3c796884cf402156f8\"" Jan 27 04:46:59.581152 containerd[1666]: time="2026-01-27T04:46:59.581124695Z" level=info msg="StartContainer for \"faeceb7f650cae6a2fa2c1ee6b11027e47f7d4bc28288c3c796884cf402156f8\"" Jan 27 04:46:59.582270 containerd[1666]: time="2026-01-27T04:46:59.582247781Z" level=info msg="connecting to shim faeceb7f650cae6a2fa2c1ee6b11027e47f7d4bc28288c3c796884cf402156f8" address="unix:///run/containerd/s/21c2f934621c4351eeaeac0c13b52aaa9db3181146d7cc55b04d404ddb3390bf" protocol=ttrpc version=3 Jan 27 04:46:59.587278 systemd[1]: Started cri-containerd-08498c2f0d210a11d9a5e71fa62ac6005d06a02a6e4fa92ef42941bc7304d4f9.scope - libcontainer container 08498c2f0d210a11d9a5e71fa62ac6005d06a02a6e4fa92ef42941bc7304d4f9. Jan 27 04:46:59.606288 systemd[1]: Started cri-containerd-faeceb7f650cae6a2fa2c1ee6b11027e47f7d4bc28288c3c796884cf402156f8.scope - libcontainer container faeceb7f650cae6a2fa2c1ee6b11027e47f7d4bc28288c3c796884cf402156f8. 
Jan 27 04:46:59.607000 audit: BPF prog-id=106 op=LOAD Jan 27 04:46:59.608000 audit: BPF prog-id=107 op=LOAD Jan 27 04:46:59.608000 audit[2718]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2579 pid=2718 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.608000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038343938633266306432313061313164396135653731666136326163 Jan 27 04:46:59.608000 audit: BPF prog-id=107 op=UNLOAD Jan 27 04:46:59.608000 audit[2718]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2579 pid=2718 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.608000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038343938633266306432313061313164396135653731666136326163 Jan 27 04:46:59.608000 audit: BPF prog-id=108 op=LOAD Jan 27 04:46:59.608000 audit[2718]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2579 pid=2718 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.608000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038343938633266306432313061313164396135653731666136326163 Jan 27 04:46:59.609000 audit: BPF prog-id=109 op=LOAD Jan 27 04:46:59.609000 audit[2718]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2579 pid=2718 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.609000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038343938633266306432313061313164396135653731666136326163 Jan 27 04:46:59.609000 audit: BPF prog-id=109 op=UNLOAD Jan 27 04:46:59.609000 audit[2718]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2579 pid=2718 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.609000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038343938633266306432313061313164396135653731666136326163 Jan 27 04:46:59.609000 audit: BPF prog-id=108 op=UNLOAD Jan 27 04:46:59.609000 audit[2718]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2579 pid=2718 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.609000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038343938633266306432313061313164396135653731666136326163 Jan 27 04:46:59.609000 audit: BPF prog-id=110 op=LOAD Jan 27 04:46:59.609000 audit[2718]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2579 pid=2718 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.609000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038343938633266306432313061313164396135653731666136326163 Jan 27 04:46:59.617773 containerd[1666]: time="2026-01-27T04:46:59.617734722Z" level=info msg="StartContainer for \"8b800740a81a7b9f349a3cd5a2ccc68af54c45769cf118158a697e47ee982379\" returns successfully" Jan 27 04:46:59.619000 audit: BPF prog-id=111 op=LOAD Jan 27 04:46:59.620000 audit: BPF prog-id=112 op=LOAD Jan 27 04:46:59.620000 audit[2737]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001f0180 a2=98 a3=0 items=0 ppid=2582 pid=2737 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661656365623766363530636165366132666132633165653662313130 Jan 27 04:46:59.620000 audit: BPF prog-id=112 op=UNLOAD Jan 27 04:46:59.620000 audit[2737]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2737 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661656365623766363530636165366132666132633165653662313130 Jan 27 04:46:59.620000 audit: BPF prog-id=113 op=LOAD Jan 27 04:46:59.620000 audit[2737]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001f03e8 a2=98 a3=0 items=0 ppid=2582 pid=2737 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.620000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661656365623766363530636165366132666132633165653662313130 Jan 27 04:46:59.620000 audit: BPF prog-id=114 op=LOAD Jan 27 04:46:59.620000 audit[2737]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001f0168 a2=98 a3=0 items=0 ppid=2582 pid=2737 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661656365623766363530636165366132666132633165653662313130 Jan 27 04:46:59.620000 audit: BPF prog-id=114 op=UNLOAD Jan 27 04:46:59.620000 audit[2737]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2737 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661656365623766363530636165366132666132633165653662313130 Jan 27 04:46:59.620000 audit: BPF prog-id=113 op=UNLOAD Jan 27 04:46:59.620000 audit[2737]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2737 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661656365623766363530636165366132666132633165653662313130 Jan 27 04:46:59.620000 audit: BPF prog-id=115 op=LOAD Jan 27 04:46:59.620000 audit[2737]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001f0648 a2=98 a3=0 items=0 ppid=2582 pid=2737 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:46:59.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661656365623766363530636165366132666132633165653662313130 Jan 27 04:46:59.646543 containerd[1666]: time="2026-01-27T04:46:59.646434668Z" level=info msg="StartContainer for \"08498c2f0d210a11d9a5e71fa62ac6005d06a02a6e4fa92ef42941bc7304d4f9\" returns successfully" Jan 27 04:46:59.659916 containerd[1666]: time="2026-01-27T04:46:59.659847577Z" level=info msg="StartContainer for \"faeceb7f650cae6a2fa2c1ee6b11027e47f7d4bc28288c3c796884cf402156f8\" returns successfully" Jan 27 04:46:59.691633 kubelet[2531]: I0127 04:46:59.691522 2531 kubelet_node_status.go:75] "Attempting to register node" node="ci-4592-0-0-n-c2731c5fad" Jan 
27 04:46:59.691936 kubelet[2531]: E0127 04:46:59.691913 2531 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.3.32:6443/api/v1/nodes\": dial tcp 10.0.3.32:6443: connect: connection refused" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.953438 kubelet[2531]: E0127 04:46:59.953400 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-n-c2731c5fad\" not found" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.956358 kubelet[2531]: E0127 04:46:59.956336 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-n-c2731c5fad\" not found" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:46:59.957796 kubelet[2531]: E0127 04:46:59.957779 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-n-c2731c5fad\" not found" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:00.495104 kubelet[2531]: I0127 04:47:00.493665 2531 kubelet_node_status.go:75] "Attempting to register node" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:00.962212 kubelet[2531]: E0127 04:47:00.960866 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-n-c2731c5fad\" not found" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:00.963036 kubelet[2531]: E0127 04:47:00.962868 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-n-c2731c5fad\" not found" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:00.963036 kubelet[2531]: E0127 04:47:00.962909 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4592-0-0-n-c2731c5fad\" not found" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:01.316827 kubelet[2531]: E0127 04:47:01.316705 2531 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4592-0-0-n-c2731c5fad\" not found" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:01.377041 kubelet[2531]: I0127 04:47:01.376744 2531 kubelet_node_status.go:78] "Successfully registered node" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:01.419790 kubelet[2531]: I0127 04:47:01.419739 2531 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:01.477189 kubelet[2531]: E0127 04:47:01.477124 2531 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4592-0-0-n-c2731c5fad\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:01.477189 kubelet[2531]: I0127 04:47:01.477160 2531 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:01.479300 kubelet[2531]: E0127 04:47:01.479269 2531 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4592-0-0-n-c2731c5fad\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:01.479300 kubelet[2531]: I0127 04:47:01.479297 2531 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:01.480844 kubelet[2531]: E0127 04:47:01.480804 2531 kubelet.go:3196] "Failed creating a mirror pod" 
err="pods \"kube-scheduler-ci-4592-0-0-n-c2731c5fad\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:01.901225 kubelet[2531]: I0127 04:47:01.901075 2531 apiserver.go:52] "Watching apiserver" Jan 27 04:47:01.919908 kubelet[2531]: I0127 04:47:01.919873 2531 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 27 04:47:01.961213 kubelet[2531]: I0127 04:47:01.961186 2531 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:01.961361 kubelet[2531]: I0127 04:47:01.961285 2531 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:01.963205 kubelet[2531]: E0127 04:47:01.963170 2531 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4592-0-0-n-c2731c5fad\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:01.963205 kubelet[2531]: E0127 04:47:01.963169 2531 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4592-0-0-n-c2731c5fad\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:03.271313 systemd[1]: Reload requested from client PID 2814 ('systemctl') (unit session-10.scope)... Jan 27 04:47:03.271330 systemd[1]: Reloading... Jan 27 04:47:03.333115 zram_generator::config[2860]: No configuration found. Jan 27 04:47:03.526757 systemd[1]: Reloading finished in 255 ms. Jan 27 04:47:03.557850 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 27 04:47:03.573434 systemd[1]: kubelet.service: Deactivated successfully. Jan 27 04:47:03.573810 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 04:47:03.573889 systemd[1]: kubelet.service: Consumed 1.960s CPU time, 129.5M memory peak. Jan 27 04:47:03.572000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:47:03.575695 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 27 04:47:03.575000 audit: BPF prog-id=116 op=LOAD Jan 27 04:47:03.575000 audit: BPF prog-id=81 op=UNLOAD Jan 27 04:47:03.575000 audit: BPF prog-id=117 op=LOAD Jan 27 04:47:03.576000 audit: BPF prog-id=85 op=UNLOAD Jan 27 04:47:03.577000 audit: BPF prog-id=118 op=LOAD Jan 27 04:47:03.577000 audit: BPF prog-id=78 op=UNLOAD Jan 27 04:47:03.577000 audit: BPF prog-id=119 op=LOAD Jan 27 04:47:03.577000 audit: BPF prog-id=120 op=LOAD Jan 27 04:47:03.577000 audit: BPF prog-id=79 op=UNLOAD Jan 27 04:47:03.577000 audit: BPF prog-id=80 op=UNLOAD Jan 27 04:47:03.577000 audit: BPF prog-id=121 op=LOAD Jan 27 04:47:03.577000 audit: BPF prog-id=122 op=LOAD Jan 27 04:47:03.578000 audit: BPF prog-id=70 op=UNLOAD Jan 27 04:47:03.578000 audit: BPF prog-id=71 op=UNLOAD Jan 27 04:47:03.588000 audit: BPF prog-id=123 op=LOAD Jan 27 04:47:03.588000 audit: BPF prog-id=66 op=UNLOAD Jan 27 04:47:03.589000 audit: BPF prog-id=124 op=LOAD Jan 27 04:47:03.589000 audit: BPF prog-id=72 op=UNLOAD Jan 27 04:47:03.589000 audit: BPF prog-id=125 op=LOAD Jan 27 04:47:03.589000 audit: BPF prog-id=126 op=LOAD Jan 27 04:47:03.589000 audit: BPF prog-id=73 op=UNLOAD Jan 27 04:47:03.589000 audit: BPF prog-id=74 op=UNLOAD Jan 27 04:47:03.590000 audit: BPF prog-id=127 op=LOAD Jan 27 04:47:03.590000 audit: BPF prog-id=82 op=UNLOAD Jan 27 04:47:03.590000 audit: BPF prog-id=128 op=LOAD Jan 27 04:47:03.590000 audit: BPF prog-id=129 op=LOAD Jan 27 04:47:03.590000 audit: BPF prog-id=83 op=UNLOAD Jan 27 04:47:03.590000 audit: BPF prog-id=84 op=UNLOAD Jan 27 04:47:03.590000 audit: BPF prog-id=130 op=LOAD Jan 27 04:47:03.590000 audit: BPF prog-id=75 op=UNLOAD Jan 27 04:47:03.590000 audit: BPF prog-id=131 op=LOAD Jan 27 04:47:03.590000 audit: BPF prog-id=132 op=LOAD Jan 27 04:47:03.591000 audit: BPF prog-id=76 op=UNLOAD Jan 27 04:47:03.591000 audit: BPF prog-id=77 op=UNLOAD Jan 27 04:47:03.591000 audit: BPF prog-id=133 op=LOAD Jan 27 04:47:03.591000 audit: BPF prog-id=67 op=UNLOAD Jan 27 04:47:03.591000 audit: BPF prog-id=134 op=LOAD Jan 27 04:47:03.591000 audit: BPF prog-id=135 op=LOAD Jan 27 04:47:03.591000 audit: BPF prog-id=68 op=UNLOAD Jan 27 04:47:03.591000 audit: BPF prog-id=69 op=UNLOAD Jan 27 04:47:03.733716 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 27 04:47:03.732000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:47:03.739077 (kubelet)[2905]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 27 04:47:03.775901 kubelet[2905]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 04:47:03.775901 kubelet[2905]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 27 04:47:03.775901 kubelet[2905]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 27 04:47:03.775901 kubelet[2905]: I0127 04:47:03.773766 2905 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 27 04:47:03.781879 kubelet[2905]: I0127 04:47:03.781763 2905 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 27 04:47:03.781879 kubelet[2905]: I0127 04:47:03.781796 2905 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 27 04:47:03.782062 kubelet[2905]: I0127 04:47:03.782043 2905 server.go:954] "Client rotation is on, will bootstrap in background" Jan 27 04:47:03.783985 kubelet[2905]: I0127 04:47:03.783952 2905 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 27 04:47:03.786321 kubelet[2905]: I0127 04:47:03.786259 2905 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 27 04:47:03.792005 kubelet[2905]: I0127 04:47:03.791981 2905 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 27 04:47:03.794529 kubelet[2905]: I0127 04:47:03.794504 2905 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 27 04:47:03.795159 kubelet[2905]: I0127 04:47:03.795119 2905 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 27 04:47:03.795323 kubelet[2905]: I0127 04:47:03.795162 2905 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4592-0-0-n-c2731c5fad","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 27 04:47:03.795397 kubelet[2905]: I0127 04:47:03.795332 2905 topology_manager.go:138] "Creating topology manager with none policy" Jan 27 04:47:03.795397 kubelet[2905]: I0127 04:47:03.795341 2905 container_manager_linux.go:304] "Creating device plugin manager" Jan 27 04:47:03.795397 kubelet[2905]: I0127 04:47:03.795387 2905 state_mem.go:36] "Initialized new in-memory state store" Jan 27 04:47:03.795543 
kubelet[2905]: I0127 04:47:03.795530 2905 kubelet.go:446] "Attempting to sync node with API server" Jan 27 04:47:03.795543 kubelet[2905]: I0127 04:47:03.795543 2905 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 27 04:47:03.795598 kubelet[2905]: I0127 04:47:03.795565 2905 kubelet.go:352] "Adding apiserver pod source" Jan 27 04:47:03.795598 kubelet[2905]: I0127 04:47:03.795577 2905 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 27 04:47:03.798796 kubelet[2905]: I0127 04:47:03.797367 2905 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 27 04:47:03.800765 kubelet[2905]: I0127 04:47:03.800723 2905 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 27 04:47:03.801966 kubelet[2905]: I0127 04:47:03.801857 2905 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 27 04:47:03.801966 kubelet[2905]: I0127 04:47:03.801897 2905 server.go:1287] "Started kubelet" Jan 27 04:47:03.804109 kubelet[2905]: I0127 04:47:03.804066 2905 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 27 04:47:03.805278 kubelet[2905]: I0127 04:47:03.804693 2905 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 27 04:47:03.805343 kubelet[2905]: I0127 04:47:03.805292 2905 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 27 04:47:03.805617 kubelet[2905]: I0127 04:47:03.805589 2905 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 27 04:47:03.805863 kubelet[2905]: I0127 04:47:03.805838 2905 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 27 04:47:03.806139 kubelet[2905]: I0127 04:47:03.806117 2905 server.go:479] "Adding debug handlers to kubelet server" Jan 27 04:47:03.811250 kubelet[2905]: I0127 04:47:03.808857 2905 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 27 04:47:03.811250 kubelet[2905]: E0127 04:47:03.809080 2905 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4592-0-0-n-c2731c5fad\" not found" Jan 27 04:47:03.811250 kubelet[2905]: I0127 04:47:03.809584 2905 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 27 04:47:03.811250 kubelet[2905]: I0127 04:47:03.809690 2905 reconciler.go:26] "Reconciler: start to sync state" Jan 27 04:47:03.812315 kubelet[2905]: I0127 04:47:03.812276 2905 factory.go:221] Registration of the systemd container factory successfully Jan 27 04:47:03.812922 kubelet[2905]: I0127 04:47:03.812386 2905 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 27 04:47:03.815443 kubelet[2905]: I0127 04:47:03.815418 2905 factory.go:221] Registration of the containerd container factory successfully Jan 27 04:47:03.822161 kubelet[2905]: E0127 04:47:03.822045 2905 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 27 04:47:03.822362 kubelet[2905]: I0127 04:47:03.822329 2905 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Jan 27 04:47:03.823796 kubelet[2905]: I0127 04:47:03.823773 2905 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 27 04:47:03.823927 kubelet[2905]: I0127 04:47:03.823915 2905 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 27 04:47:03.824011 kubelet[2905]: I0127 04:47:03.823999 2905 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 27 04:47:03.824055 kubelet[2905]: I0127 04:47:03.824047 2905 kubelet.go:2382] "Starting kubelet main sync loop" Jan 27 04:47:03.824192 kubelet[2905]: E0127 04:47:03.824175 2905 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 27 04:47:03.862914 kubelet[2905]: I0127 04:47:03.862888 2905 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 27 04:47:03.863107 kubelet[2905]: I0127 04:47:03.863026 2905 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 27 04:47:03.863107 kubelet[2905]: I0127 04:47:03.863050 2905 state_mem.go:36] "Initialized new in-memory state store" Jan 27 04:47:03.863331 kubelet[2905]: I0127 04:47:03.863313 2905 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 27 04:47:03.863399 kubelet[2905]: I0127 04:47:03.863376 2905 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 27 04:47:03.863443 kubelet[2905]: I0127 04:47:03.863435 2905 policy_none.go:49] "None policy: Start" Jan 27 04:47:03.863488 kubelet[2905]: I0127 04:47:03.863480 2905 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 27 04:47:03.863536 kubelet[2905]: I0127 04:47:03.863528 2905 state_mem.go:35] "Initializing new in-memory state store" Jan 27 04:47:03.863689 kubelet[2905]: I0127 04:47:03.863676 2905 state_mem.go:75] "Updated machine memory state" Jan 27 04:47:03.867289 kubelet[2905]: I0127 04:47:03.867230 2905 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 27 04:47:03.867396 kubelet[2905]: I0127 04:47:03.867379 2905 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 27 04:47:03.867450 kubelet[2905]: I0127 04:47:03.867397 2905 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 27 04:47:03.867633 kubelet[2905]: I0127 04:47:03.867612 2905 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 27 04:47:03.868304 kubelet[2905]: E0127 04:47:03.868281 2905 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 27 04:47:03.925396 kubelet[2905]: I0127 04:47:03.925355 2905 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:03.925629 kubelet[2905]: I0127 04:47:03.925355 2905 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:03.925736 kubelet[2905]: I0127 04:47:03.925491 2905 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:03.970921 kubelet[2905]: I0127 04:47:03.970886 2905 kubelet_node_status.go:75] "Attempting to register node" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:03.981007 kubelet[2905]: I0127 04:47:03.980920 2905 kubelet_node_status.go:124] "Node was previously registered" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:03.981007 kubelet[2905]: I0127 04:47:03.981020 2905 kubelet_node_status.go:78] "Successfully registered node" node="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:04.011405 kubelet[2905]: I0127 04:47:04.011221 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9f0a1cead54afc0be00aa20c32026bef-ca-certs\") pod \"kube-apiserver-ci-4592-0-0-n-c2731c5fad\" (UID: \"9f0a1cead54afc0be00aa20c32026bef\") " pod="kube-system/kube-apiserver-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:04.011405 kubelet[2905]: I0127 04:47:04.011266 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9f0a1cead54afc0be00aa20c32026bef-k8s-certs\") pod \"kube-apiserver-ci-4592-0-0-n-c2731c5fad\" (UID: \"9f0a1cead54afc0be00aa20c32026bef\") " pod="kube-system/kube-apiserver-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:04.011405 kubelet[2905]: I0127 04:47:04.011284 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fed1eea61a1c438004a60752c1c3b064-flexvolume-dir\") pod \"kube-controller-manager-ci-4592-0-0-n-c2731c5fad\" (UID: \"fed1eea61a1c438004a60752c1c3b064\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:04.011405 kubelet[2905]: I0127 04:47:04.011303 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fed1eea61a1c438004a60752c1c3b064-k8s-certs\") pod \"kube-controller-manager-ci-4592-0-0-n-c2731c5fad\" (UID: \"fed1eea61a1c438004a60752c1c3b064\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:04.011405 kubelet[2905]: I0127 04:47:04.011319 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fed1eea61a1c438004a60752c1c3b064-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4592-0-0-n-c2731c5fad\" (UID: \"fed1eea61a1c438004a60752c1c3b064\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:04.011808 kubelet[2905]: I0127 04:47:04.011336 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4d3e8bbb89caa8d20e7e50b2f3a936f9-kubeconfig\") pod \"kube-scheduler-ci-4592-0-0-n-c2731c5fad\" (UID: 
\"4d3e8bbb89caa8d20e7e50b2f3a936f9\") " pod="kube-system/kube-scheduler-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:04.011808 kubelet[2905]: I0127 04:47:04.011352 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9f0a1cead54afc0be00aa20c32026bef-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4592-0-0-n-c2731c5fad\" (UID: \"9f0a1cead54afc0be00aa20c32026bef\") " pod="kube-system/kube-apiserver-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:04.011808 kubelet[2905]: I0127 04:47:04.011369 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fed1eea61a1c438004a60752c1c3b064-ca-certs\") pod \"kube-controller-manager-ci-4592-0-0-n-c2731c5fad\" (UID: \"fed1eea61a1c438004a60752c1c3b064\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:04.011808 kubelet[2905]: I0127 04:47:04.011393 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fed1eea61a1c438004a60752c1c3b064-kubeconfig\") pod \"kube-controller-manager-ci-4592-0-0-n-c2731c5fad\" (UID: \"fed1eea61a1c438004a60752c1c3b064\") " pod="kube-system/kube-controller-manager-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:04.797893 kubelet[2905]: I0127 04:47:04.797533 2905 apiserver.go:52] "Watching apiserver" Jan 27 04:47:04.809828 kubelet[2905]: I0127 04:47:04.809769 2905 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 27 04:47:04.850822 kubelet[2905]: I0127 04:47:04.850461 2905 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:04.850822 kubelet[2905]: I0127 04:47:04.850628 2905 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:04.860533 kubelet[2905]: E0127 04:47:04.860498 2905 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4592-0-0-n-c2731c5fad\" already exists" pod="kube-system/kube-apiserver-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:04.860803 kubelet[2905]: E0127 04:47:04.860498 2905 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4592-0-0-n-c2731c5fad\" already exists" pod="kube-system/kube-scheduler-ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:04.880295 kubelet[2905]: I0127 04:47:04.880154 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4592-0-0-n-c2731c5fad" podStartSLOduration=1.8799766660000001 podStartE2EDuration="1.879976666s" podCreationTimestamp="2026-01-27 04:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 04:47:04.879783465 +0000 UTC m=+1.137394918" watchObservedRunningTime="2026-01-27 04:47:04.879976666 +0000 UTC m=+1.137588119" Jan 27 04:47:04.880612 kubelet[2905]: I0127 04:47:04.880482 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4592-0-0-n-c2731c5fad" podStartSLOduration=1.8804740290000002 podStartE2EDuration="1.880474029s" podCreationTimestamp="2026-01-27 04:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 
04:47:04.870000615 +0000 UTC m=+1.127612068" watchObservedRunningTime="2026-01-27 04:47:04.880474029 +0000 UTC m=+1.138085482" Jan 27 04:47:04.900327 kubelet[2905]: I0127 04:47:04.900267 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4592-0-0-n-c2731c5fad" podStartSLOduration=1.900250649 podStartE2EDuration="1.900250649s" podCreationTimestamp="2026-01-27 04:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 04:47:04.889712476 +0000 UTC m=+1.147323929" watchObservedRunningTime="2026-01-27 04:47:04.900250649 +0000 UTC m=+1.157862102" Jan 27 04:47:05.702018 update_engine[1654]: I20260127 04:47:05.701872 1654 update_attempter.cc:509] Updating boot flags... Jan 27 04:47:09.234287 kubelet[2905]: I0127 04:47:09.234255 2905 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 27 04:47:09.234674 containerd[1666]: time="2026-01-27T04:47:09.234531482Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 27 04:47:09.235141 kubelet[2905]: I0127 04:47:09.234943 2905 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 27 04:47:09.901420 systemd[1]: Created slice kubepods-besteffort-pode905d33d_10a2_4704_962d_0ee7b053cf58.slice - libcontainer container kubepods-besteffort-pode905d33d_10a2_4704_962d_0ee7b053cf58.slice. Jan 27 04:47:09.950621 kubelet[2905]: I0127 04:47:09.950571 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dktl\" (UniqueName: \"kubernetes.io/projected/e905d33d-10a2-4704-962d-0ee7b053cf58-kube-api-access-4dktl\") pod \"kube-proxy-mg2p2\" (UID: \"e905d33d-10a2-4704-962d-0ee7b053cf58\") " pod="kube-system/kube-proxy-mg2p2" Jan 27 04:47:09.950621 kubelet[2905]: I0127 04:47:09.950617 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/e905d33d-10a2-4704-962d-0ee7b053cf58-kube-proxy\") pod \"kube-proxy-mg2p2\" (UID: \"e905d33d-10a2-4704-962d-0ee7b053cf58\") " pod="kube-system/kube-proxy-mg2p2" Jan 27 04:47:09.950781 kubelet[2905]: I0127 04:47:09.950639 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e905d33d-10a2-4704-962d-0ee7b053cf58-xtables-lock\") pod \"kube-proxy-mg2p2\" (UID: \"e905d33d-10a2-4704-962d-0ee7b053cf58\") " pod="kube-system/kube-proxy-mg2p2" Jan 27 04:47:09.950781 kubelet[2905]: I0127 04:47:09.950654 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e905d33d-10a2-4704-962d-0ee7b053cf58-lib-modules\") pod \"kube-proxy-mg2p2\" (UID: \"e905d33d-10a2-4704-962d-0ee7b053cf58\") " pod="kube-system/kube-proxy-mg2p2" Jan 27 04:47:10.059642 kubelet[2905]: E0127 04:47:10.059611 2905 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Jan 27 04:47:10.059642 kubelet[2905]: E0127 04:47:10.059640 2905 projected.go:194] Error preparing data for projected volume kube-api-access-4dktl for pod kube-system/kube-proxy-mg2p2: configmap "kube-root-ca.crt" not found Jan 27 04:47:10.059782 kubelet[2905]: E0127 04:47:10.059696 2905 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e905d33d-10a2-4704-962d-0ee7b053cf58-kube-api-access-4dktl podName:e905d33d-10a2-4704-962d-0ee7b053cf58 nodeName:}" failed. No retries permitted until 2026-01-27 04:47:10.559676651 +0000 UTC m=+6.817288104 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4dktl" (UniqueName: "kubernetes.io/projected/e905d33d-10a2-4704-962d-0ee7b053cf58-kube-api-access-4dktl") pod "kube-proxy-mg2p2" (UID: "e905d33d-10a2-4704-962d-0ee7b053cf58") : configmap "kube-root-ca.crt" not found Jan 27 04:47:10.321871 systemd[1]: Created slice kubepods-besteffort-pode38e7754_e013_45ac_9d88_ca7e4c7a3653.slice - libcontainer container kubepods-besteffort-pode38e7754_e013_45ac_9d88_ca7e4c7a3653.slice. Jan 27 04:47:10.353687 kubelet[2905]: I0127 04:47:10.353596 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk7rp\" (UniqueName: \"kubernetes.io/projected/e38e7754-e013-45ac-9d88-ca7e4c7a3653-kube-api-access-kk7rp\") pod \"tigera-operator-7dcd859c48-49b9p\" (UID: \"e38e7754-e013-45ac-9d88-ca7e4c7a3653\") " pod="tigera-operator/tigera-operator-7dcd859c48-49b9p" Jan 27 04:47:10.353687 kubelet[2905]: I0127 04:47:10.353652 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e38e7754-e013-45ac-9d88-ca7e4c7a3653-var-lib-calico\") pod \"tigera-operator-7dcd859c48-49b9p\" (UID: \"e38e7754-e013-45ac-9d88-ca7e4c7a3653\") " pod="tigera-operator/tigera-operator-7dcd859c48-49b9p" Jan 27 04:47:10.626873 containerd[1666]: time="2026-01-27T04:47:10.626732464Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-49b9p,Uid:e38e7754-e013-45ac-9d88-ca7e4c7a3653,Namespace:tigera-operator,Attempt:0,}" Jan 27 04:47:10.660552 containerd[1666]: time="2026-01-27T04:47:10.659161070Z" level=info msg="connecting to shim 462971cd0ca9b64c54265c1d210e0f530dbcad764ab4382b080bde39bff7a456" address="unix:///run/containerd/s/a6e460059dce338f750180ed762b1e57675fc463872533e19d5269084708bebb" namespace=k8s.io protocol=ttrpc version=3 Jan 27 04:47:10.681316 systemd[1]: Started cri-containerd-462971cd0ca9b64c54265c1d210e0f530dbcad764ab4382b080bde39bff7a456.scope - libcontainer container 462971cd0ca9b64c54265c1d210e0f530dbcad764ab4382b080bde39bff7a456. 
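In the audit records that follow, the PROCTITLE field is the audited process's command line, hex-encoded with NUL bytes separating the arguments; decoded, these are the runc invocations for the container scopes started above and, a little further down, kube-proxy's iptables/ip6tables chain setup (one of them decodes to iptables -w 5 -W 100000 -N KUBE-PROXY-CANARY -t mangle). A throwaway decoder, assuming only a hex string copied out of a record:

```python
def decode_proctitle(hex_proctitle: str) -> list[str]:
    """Turn an audit PROCTITLE hex string back into its argv list."""
    return bytes.fromhex(hex_proctitle).decode("utf-8", errors="replace").split("\x00")

# e.g. decode_proctitle("69707461626C6573002D770035") -> ['iptables', '-w', '5']
```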
Jan 27 04:47:10.691000 audit: BPF prog-id=136 op=LOAD Jan 27 04:47:10.692592 kernel: kauditd_printk_skb: 200 callbacks suppressed Jan 27 04:47:10.692672 kernel: audit: type=1334 audit(1769489230.691:451): prog-id=136 op=LOAD Jan 27 04:47:10.692703 kernel: audit: type=1334 audit(1769489230.691:452): prog-id=137 op=LOAD Jan 27 04:47:10.691000 audit: BPF prog-id=137 op=LOAD Jan 27 04:47:10.691000 audit[2989]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2978 pid=2989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.696132 kernel: audit: type=1300 audit(1769489230.691:452): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2978 pid=2989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.696231 kernel: audit: type=1327 audit(1769489230.691:452): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436323937316364306361396236346335343236356331643231306530 Jan 27 04:47:10.691000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436323937316364306361396236346335343236356331643231306530 Jan 27 04:47:10.698840 kernel: audit: type=1334 audit(1769489230.692:453): prog-id=137 op=UNLOAD Jan 27 04:47:10.692000 audit: BPF prog-id=137 op=UNLOAD Jan 27 04:47:10.692000 audit[2989]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2978 pid=2989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.702414 kernel: audit: type=1300 audit(1769489230.692:453): arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2978 pid=2989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.702477 kernel: audit: type=1327 audit(1769489230.692:453): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436323937316364306361396236346335343236356331643231306530 Jan 27 04:47:10.692000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436323937316364306361396236346335343236356331643231306530 Jan 27 04:47:10.705136 kernel: audit: type=1334 audit(1769489230.692:454): prog-id=138 op=LOAD Jan 27 04:47:10.692000 audit: BPF prog-id=138 op=LOAD Jan 27 04:47:10.705786 kernel: audit: type=1300 audit(1769489230.692:454): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2978 pid=2989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.692000 audit[2989]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2978 pid=2989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.708652 kernel: audit: type=1327 audit(1769489230.692:454): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436323937316364306361396236346335343236356331643231306530 Jan 27 04:47:10.692000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436323937316364306361396236346335343236356331643231306530 Jan 27 04:47:10.693000 audit: BPF prog-id=139 op=LOAD Jan 27 04:47:10.693000 audit[2989]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2978 pid=2989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.693000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436323937316364306361396236346335343236356331643231306530 Jan 27 04:47:10.695000 audit: BPF prog-id=139 op=UNLOAD Jan 27 04:47:10.695000 audit[2989]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2978 pid=2989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.695000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436323937316364306361396236346335343236356331643231306530 Jan 27 04:47:10.695000 audit: BPF prog-id=138 op=UNLOAD Jan 27 04:47:10.695000 audit[2989]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2978 pid=2989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.695000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436323937316364306361396236346335343236356331643231306530 Jan 27 04:47:10.696000 audit: BPF prog-id=140 op=LOAD Jan 27 04:47:10.696000 audit[2989]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2978 pid=2989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.696000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436323937316364306361396236346335343236356331643231306530 Jan 27 04:47:10.730220 containerd[1666]: time="2026-01-27T04:47:10.730178992Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-49b9p,Uid:e38e7754-e013-45ac-9d88-ca7e4c7a3653,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"462971cd0ca9b64c54265c1d210e0f530dbcad764ab4382b080bde39bff7a456\"" Jan 27 04:47:10.732470 containerd[1666]: time="2026-01-27T04:47:10.732437724Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 27 04:47:10.822391 containerd[1666]: time="2026-01-27T04:47:10.822324582Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mg2p2,Uid:e905d33d-10a2-4704-962d-0ee7b053cf58,Namespace:kube-system,Attempt:0,}" Jan 27 04:47:10.864819 containerd[1666]: time="2026-01-27T04:47:10.864624078Z" level=info msg="connecting to shim b2429552ba7f369c5a97af50143c2aba135d0e4a61227a567d7d92918bdb0ca7" address="unix:///run/containerd/s/88a853ba5d1035b4c4129bb470b4d0c4cf48a9f977311c0820dd3d37895db0da" namespace=k8s.io protocol=ttrpc version=3 Jan 27 04:47:10.888408 systemd[1]: Started cri-containerd-b2429552ba7f369c5a97af50143c2aba135d0e4a61227a567d7d92918bdb0ca7.scope - libcontainer container b2429552ba7f369c5a97af50143c2aba135d0e4a61227a567d7d92918bdb0ca7. Jan 27 04:47:10.897000 audit: BPF prog-id=141 op=LOAD Jan 27 04:47:10.897000 audit: BPF prog-id=142 op=LOAD Jan 27 04:47:10.897000 audit[3035]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3023 pid=3035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6232343239353532626137663336396335613937616635303134336332 Jan 27 04:47:10.897000 audit: BPF prog-id=142 op=UNLOAD Jan 27 04:47:10.897000 audit[3035]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6232343239353532626137663336396335613937616635303134336332 Jan 27 04:47:10.897000 audit: BPF prog-id=143 op=LOAD Jan 27 04:47:10.897000 audit[3035]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3023 pid=3035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.897000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6232343239353532626137663336396335613937616635303134336332 Jan 27 04:47:10.897000 audit: BPF prog-id=144 op=LOAD Jan 27 04:47:10.897000 audit[3035]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3023 pid=3035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6232343239353532626137663336396335613937616635303134336332 Jan 27 04:47:10.897000 audit: BPF prog-id=144 op=UNLOAD Jan 27 04:47:10.897000 audit[3035]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6232343239353532626137663336396335613937616635303134336332 Jan 27 04:47:10.897000 audit: BPF prog-id=143 op=UNLOAD Jan 27 04:47:10.897000 audit[3035]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6232343239353532626137663336396335613937616635303134336332 Jan 27 04:47:10.897000 audit: BPF prog-id=145 op=LOAD Jan 27 04:47:10.897000 audit[3035]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3023 pid=3035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:10.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6232343239353532626137663336396335613937616635303134336332 Jan 27 04:47:10.911648 containerd[1666]: time="2026-01-27T04:47:10.911471677Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mg2p2,Uid:e905d33d-10a2-4704-962d-0ee7b053cf58,Namespace:kube-system,Attempt:0,} returns sandbox id \"b2429552ba7f369c5a97af50143c2aba135d0e4a61227a567d7d92918bdb0ca7\"" Jan 27 04:47:10.915059 containerd[1666]: time="2026-01-27T04:47:10.915022615Z" level=info msg="CreateContainer within sandbox \"b2429552ba7f369c5a97af50143c2aba135d0e4a61227a567d7d92918bdb0ca7\" for container 
&ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 27 04:47:10.931564 containerd[1666]: time="2026-01-27T04:47:10.931521819Z" level=info msg="Container 3827b0d58df813132d8b40b2829c4866eaa2f187b1c1d938c95ecb488111b1f2: CDI devices from CRI Config.CDIDevices: []" Jan 27 04:47:10.945216 containerd[1666]: time="2026-01-27T04:47:10.945174969Z" level=info msg="CreateContainer within sandbox \"b2429552ba7f369c5a97af50143c2aba135d0e4a61227a567d7d92918bdb0ca7\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"3827b0d58df813132d8b40b2829c4866eaa2f187b1c1d938c95ecb488111b1f2\"" Jan 27 04:47:10.946310 containerd[1666]: time="2026-01-27T04:47:10.946242575Z" level=info msg="StartContainer for \"3827b0d58df813132d8b40b2829c4866eaa2f187b1c1d938c95ecb488111b1f2\"" Jan 27 04:47:10.947846 containerd[1666]: time="2026-01-27T04:47:10.947819583Z" level=info msg="connecting to shim 3827b0d58df813132d8b40b2829c4866eaa2f187b1c1d938c95ecb488111b1f2" address="unix:///run/containerd/s/88a853ba5d1035b4c4129bb470b4d0c4cf48a9f977311c0820dd3d37895db0da" protocol=ttrpc version=3 Jan 27 04:47:10.972303 systemd[1]: Started cri-containerd-3827b0d58df813132d8b40b2829c4866eaa2f187b1c1d938c95ecb488111b1f2.scope - libcontainer container 3827b0d58df813132d8b40b2829c4866eaa2f187b1c1d938c95ecb488111b1f2. Jan 27 04:47:11.046000 audit: BPF prog-id=146 op=LOAD Jan 27 04:47:11.046000 audit[3060]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001383e8 a2=98 a3=0 items=0 ppid=3023 pid=3060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.046000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338323762306435386466383133313332643862343062323832396334 Jan 27 04:47:11.047000 audit: BPF prog-id=147 op=LOAD Jan 27 04:47:11.047000 audit[3060]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000138168 a2=98 a3=0 items=0 ppid=3023 pid=3060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.047000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338323762306435386466383133313332643862343062323832396334 Jan 27 04:47:11.047000 audit: BPF prog-id=147 op=UNLOAD Jan 27 04:47:11.047000 audit[3060]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.047000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338323762306435386466383133313332643862343062323832396334 Jan 27 04:47:11.047000 audit: BPF prog-id=146 op=UNLOAD Jan 27 04:47:11.047000 audit[3060]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3060 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.047000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338323762306435386466383133313332643862343062323832396334 Jan 27 04:47:11.047000 audit: BPF prog-id=148 op=LOAD Jan 27 04:47:11.047000 audit[3060]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000138648 a2=98 a3=0 items=0 ppid=3023 pid=3060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.047000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338323762306435386466383133313332643862343062323832396334 Jan 27 04:47:11.071171 containerd[1666]: time="2026-01-27T04:47:11.071133852Z" level=info msg="StartContainer for \"3827b0d58df813132d8b40b2829c4866eaa2f187b1c1d938c95ecb488111b1f2\" returns successfully" Jan 27 04:47:11.213000 audit[3122]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3122 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.213000 audit[3122]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd6803b50 a2=0 a3=1 items=0 ppid=3073 pid=3122 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.213000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 27 04:47:11.213000 audit[3123]: NETFILTER_CFG table=mangle:55 family=10 entries=1 op=nft_register_chain pid=3123 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.213000 audit[3123]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff06902b0 a2=0 a3=1 items=0 ppid=3073 pid=3123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.213000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 27 04:47:11.214000 audit[3124]: NETFILTER_CFG table=nat:56 family=2 entries=1 op=nft_register_chain pid=3124 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.214000 audit[3124]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff856b330 a2=0 a3=1 items=0 ppid=3073 pid=3124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.214000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 27 04:47:11.215000 audit[3126]: NETFILTER_CFG table=filter:57 family=2 entries=1 op=nft_register_chain pid=3126 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 
04:47:11.215000 audit[3126]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc3479ee0 a2=0 a3=1 items=0 ppid=3073 pid=3126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.215000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 27 04:47:11.216000 audit[3127]: NETFILTER_CFG table=nat:58 family=10 entries=1 op=nft_register_chain pid=3127 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.216000 audit[3127]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffec9ce4a0 a2=0 a3=1 items=0 ppid=3073 pid=3127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.216000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 27 04:47:11.218000 audit[3128]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3128 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.218000 audit[3128]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd9af4df0 a2=0 a3=1 items=0 ppid=3073 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.218000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 27 04:47:11.315000 audit[3129]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3129 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.315000 audit[3129]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffe60279c0 a2=0 a3=1 items=0 ppid=3073 pid=3129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.315000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 27 04:47:11.318000 audit[3131]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3131 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.318000 audit[3131]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=fffff6fe1ac0 a2=0 a3=1 items=0 ppid=3073 pid=3131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.318000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 27 04:47:11.321000 audit[3134]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3134 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.321000 audit[3134]: SYSCALL arch=c00000b7 
syscall=211 success=yes exit=752 a0=3 a1=fffff7710f50 a2=0 a3=1 items=0 ppid=3073 pid=3134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.321000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 27 04:47:11.322000 audit[3135]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3135 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.322000 audit[3135]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff22c2c70 a2=0 a3=1 items=0 ppid=3073 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.322000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 27 04:47:11.324000 audit[3137]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3137 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.324000 audit[3137]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffd8409bb0 a2=0 a3=1 items=0 ppid=3073 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.324000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 27 04:47:11.325000 audit[3138]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3138 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.325000 audit[3138]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe997f480 a2=0 a3=1 items=0 ppid=3073 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.325000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 27 04:47:11.328000 audit[3140]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3140 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.328000 audit[3140]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffc3a4c270 a2=0 a3=1 items=0 ppid=3073 pid=3140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.328000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 27 04:47:11.331000 
audit[3143]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3143 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.331000 audit[3143]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffcda7ecd0 a2=0 a3=1 items=0 ppid=3073 pid=3143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.331000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 27 04:47:11.332000 audit[3144]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3144 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.332000 audit[3144]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc9db3b00 a2=0 a3=1 items=0 ppid=3073 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.332000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 27 04:47:11.334000 audit[3146]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3146 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.334000 audit[3146]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffdc727290 a2=0 a3=1 items=0 ppid=3073 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.334000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 27 04:47:11.335000 audit[3147]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3147 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.335000 audit[3147]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe58daa20 a2=0 a3=1 items=0 ppid=3073 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.335000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 27 04:47:11.337000 audit[3149]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.337000 audit[3149]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd8cf2ad0 a2=0 a3=1 items=0 ppid=3073 pid=3149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.337000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 27 04:47:11.341000 audit[3152]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3152 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.341000 audit[3152]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffc7f26270 a2=0 a3=1 items=0 ppid=3073 pid=3152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.341000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 27 04:47:11.344000 audit[3155]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3155 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.344000 audit[3155]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd74bec70 a2=0 a3=1 items=0 ppid=3073 pid=3155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.344000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 27 04:47:11.345000 audit[3156]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3156 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.345000 audit[3156]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc2e01cb0 a2=0 a3=1 items=0 ppid=3073 pid=3156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.345000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 27 04:47:11.347000 audit[3158]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3158 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.347000 audit[3158]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=fffffe189180 a2=0 a3=1 items=0 ppid=3073 pid=3158 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.347000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 04:47:11.351000 audit[3161]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3161 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.351000 audit[3161]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=528 a0=3 a1=ffffe64daee0 a2=0 a3=1 items=0 ppid=3073 pid=3161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.351000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 04:47:11.352000 audit[3162]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3162 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.352000 audit[3162]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffceff8eb0 a2=0 a3=1 items=0 ppid=3073 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.352000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 27 04:47:11.354000 audit[3164]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3164 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 27 04:47:11.354000 audit[3164]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffefc08300 a2=0 a3=1 items=0 ppid=3073 pid=3164 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.354000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 27 04:47:11.376000 audit[3170]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3170 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:11.376000 audit[3170]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffffa22ef80 a2=0 a3=1 items=0 ppid=3073 pid=3170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.376000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:11.389000 audit[3170]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3170 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:11.389000 audit[3170]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=fffffa22ef80 a2=0 a3=1 items=0 ppid=3073 pid=3170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.389000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:11.391000 audit[3175]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3175 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.391000 audit[3175]: 
SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=fffff5bd0d00 a2=0 a3=1 items=0 ppid=3073 pid=3175 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.391000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 27 04:47:11.395000 audit[3177]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3177 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.395000 audit[3177]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=ffffda7e3560 a2=0 a3=1 items=0 ppid=3073 pid=3177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.395000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 27 04:47:11.399000 audit[3180]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3180 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.399000 audit[3180]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=fffff1284740 a2=0 a3=1 items=0 ppid=3073 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.399000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 27 04:47:11.400000 audit[3181]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3181 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.400000 audit[3181]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd730a700 a2=0 a3=1 items=0 ppid=3073 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.400000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 27 04:47:11.403000 audit[3183]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3183 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.403000 audit[3183]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff7928e00 a2=0 a3=1 items=0 ppid=3073 pid=3183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.403000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 27 04:47:11.404000 audit[3184]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3184 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.404000 audit[3184]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff957c960 a2=0 a3=1 items=0 ppid=3073 pid=3184 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.404000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 27 04:47:11.406000 audit[3186]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3186 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.406000 audit[3186]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=fffff0fff7c0 a2=0 a3=1 items=0 ppid=3073 pid=3186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.406000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 27 04:47:11.409000 audit[3189]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3189 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.409000 audit[3189]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=ffffde6945c0 a2=0 a3=1 items=0 ppid=3073 pid=3189 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.409000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 27 04:47:11.410000 audit[3190]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3190 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.410000 audit[3190]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff1b34990 a2=0 a3=1 items=0 ppid=3073 pid=3190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.410000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 27 04:47:11.413000 audit[3192]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3192 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.413000 audit[3192]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff1808bc0 a2=0 a3=1 items=0 ppid=3073 pid=3192 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.413000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 27 04:47:11.414000 audit[3193]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3193 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.414000 audit[3193]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffddef0ce0 a2=0 a3=1 items=0 ppid=3073 pid=3193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.414000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 27 04:47:11.416000 audit[3195]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3195 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.416000 audit[3195]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff910a590 a2=0 a3=1 items=0 ppid=3073 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.416000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 27 04:47:11.419000 audit[3198]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3198 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.419000 audit[3198]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff839d640 a2=0 a3=1 items=0 ppid=3073 pid=3198 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.419000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 27 04:47:11.422000 audit[3201]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3201 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.422000 audit[3201]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffcf5d85a0 a2=0 a3=1 items=0 ppid=3073 pid=3201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.422000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 27 04:47:11.424000 audit[3202]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3202 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.424000 audit[3202]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd6e8aa80 a2=0 a3=1 items=0 ppid=3073 pid=3202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.424000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 27 04:47:11.426000 audit[3204]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3204 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.426000 audit[3204]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffd2ea2900 a2=0 a3=1 items=0 ppid=3073 pid=3204 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.426000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 04:47:11.429000 audit[3207]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3207 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.429000 audit[3207]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff20dae20 a2=0 a3=1 items=0 ppid=3073 pid=3207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.429000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 27 04:47:11.430000 audit[3208]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3208 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.430000 audit[3208]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc3bbbb00 a2=0 a3=1 items=0 ppid=3073 pid=3208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.430000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 27 04:47:11.432000 audit[3210]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3210 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.432000 audit[3210]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffd13ac050 a2=0 a3=1 items=0 ppid=3073 pid=3210 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.432000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 27 04:47:11.433000 audit[3211]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3211 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.433000 audit[3211]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdace8060 a2=0 a3=1 items=0 ppid=3073 pid=3211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.433000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 27 04:47:11.436000 audit[3213]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3213 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.436000 audit[3213]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffe8271a00 a2=0 a3=1 items=0 ppid=3073 pid=3213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.436000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 27 04:47:11.439000 audit[3216]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3216 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 27 04:47:11.439000 audit[3216]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=fffff0d27820 a2=0 a3=1 items=0 ppid=3073 pid=3216 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.439000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 27 04:47:11.442000 audit[3218]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3218 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 27 04:47:11.442000 audit[3218]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 a1=ffffc8543710 a2=0 a3=1 items=0 ppid=3073 pid=3218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.442000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:11.443000 audit[3218]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3218 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 27 04:47:11.443000 audit[3218]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=ffffc8543710 a2=0 a3=1 items=0 ppid=3073 pid=3218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:11.443000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:11.875941 kubelet[2905]: I0127 04:47:11.875882 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-mg2p2" podStartSLOduration=2.875865117 podStartE2EDuration="2.875865117s" podCreationTimestamp="2026-01-27 04:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 04:47:11.875639996 +0000 UTC m=+8.133251449" watchObservedRunningTime="2026-01-27 04:47:11.875865117 +0000 UTC m=+8.133476570" Jan 27 04:47:12.360242 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3895821963.mount: Deactivated successfully. Jan 27 04:47:12.666067 containerd[1666]: time="2026-01-27T04:47:12.665949108Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:47:12.672034 containerd[1666]: time="2026-01-27T04:47:12.671919219Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=20773434" Jan 27 04:47:12.678109 containerd[1666]: time="2026-01-27T04:47:12.677961770Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:47:12.684228 containerd[1666]: time="2026-01-27T04:47:12.684116801Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:47:12.685211 containerd[1666]: time="2026-01-27T04:47:12.685160166Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 1.952680322s" Jan 27 04:47:12.685211 containerd[1666]: time="2026-01-27T04:47:12.685197046Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Jan 27 04:47:12.688037 containerd[1666]: time="2026-01-27T04:47:12.688004261Z" level=info msg="CreateContainer within sandbox \"462971cd0ca9b64c54265c1d210e0f530dbcad764ab4382b080bde39bff7a456\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 27 04:47:12.706752 containerd[1666]: time="2026-01-27T04:47:12.706690116Z" level=info msg="Container e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a: CDI devices from CRI Config.CDIDevices: []" Jan 27 04:47:12.708387 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4111939006.mount: Deactivated successfully. 
Jan 27 04:47:12.725229 containerd[1666]: time="2026-01-27T04:47:12.725189050Z" level=info msg="CreateContainer within sandbox \"462971cd0ca9b64c54265c1d210e0f530dbcad764ab4382b080bde39bff7a456\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a\"" Jan 27 04:47:12.725681 containerd[1666]: time="2026-01-27T04:47:12.725639453Z" level=info msg="StartContainer for \"e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a\"" Jan 27 04:47:12.726525 containerd[1666]: time="2026-01-27T04:47:12.726496257Z" level=info msg="connecting to shim e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a" address="unix:///run/containerd/s/a6e460059dce338f750180ed762b1e57675fc463872533e19d5269084708bebb" protocol=ttrpc version=3 Jan 27 04:47:12.747521 systemd[1]: Started cri-containerd-e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a.scope - libcontainer container e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a. Jan 27 04:47:12.757000 audit: BPF prog-id=149 op=LOAD Jan 27 04:47:12.757000 audit: BPF prog-id=150 op=LOAD Jan 27 04:47:12.757000 audit[3227]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2978 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:12.757000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537396437616137343832643566346630343339663037353265373135 Jan 27 04:47:12.758000 audit: BPF prog-id=150 op=UNLOAD Jan 27 04:47:12.758000 audit[3227]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2978 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:12.758000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537396437616137343832643566346630343339663037353265373135 Jan 27 04:47:12.758000 audit: BPF prog-id=151 op=LOAD Jan 27 04:47:12.758000 audit[3227]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2978 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:12.758000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537396437616137343832643566346630343339663037353265373135 Jan 27 04:47:12.758000 audit: BPF prog-id=152 op=LOAD Jan 27 04:47:12.758000 audit[3227]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2978 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:12.758000 audit: 
PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537396437616137343832643566346630343339663037353265373135 Jan 27 04:47:12.758000 audit: BPF prog-id=152 op=UNLOAD Jan 27 04:47:12.758000 audit[3227]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2978 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:12.758000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537396437616137343832643566346630343339663037353265373135 Jan 27 04:47:12.759000 audit: BPF prog-id=151 op=UNLOAD Jan 27 04:47:12.759000 audit[3227]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2978 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:12.759000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537396437616137343832643566346630343339663037353265373135 Jan 27 04:47:12.759000 audit: BPF prog-id=153 op=LOAD Jan 27 04:47:12.759000 audit[3227]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2978 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:12.759000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6537396437616137343832643566346630343339663037353265373135 Jan 27 04:47:12.776207 containerd[1666]: time="2026-01-27T04:47:12.776172671Z" level=info msg="StartContainer for \"e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a\" returns successfully" Jan 27 04:47:12.878875 kubelet[2905]: I0127 04:47:12.878787 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-49b9p" podStartSLOduration=0.924281982 podStartE2EDuration="2.878768594s" podCreationTimestamp="2026-01-27 04:47:10 +0000 UTC" firstStartedPulling="2026-01-27 04:47:10.731930641 +0000 UTC m=+6.989542094" lastFinishedPulling="2026-01-27 04:47:12.686417253 +0000 UTC m=+8.944028706" observedRunningTime="2026-01-27 04:47:12.878449872 +0000 UTC m=+9.136061325" watchObservedRunningTime="2026-01-27 04:47:12.878768594 +0000 UTC m=+9.136380047" Jan 27 04:47:18.118404 sudo[1952]: pam_unix(sudo:session): session closed for user root Jan 27 04:47:18.121245 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 27 04:47:18.121321 kernel: audit: type=1106 audit(1769489238.118:531): pid=1952 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" 
hostname=? addr=? terminal=? res=success' Jan 27 04:47:18.118000 audit[1952]: USER_END pid=1952 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 04:47:18.125652 kernel: audit: type=1104 audit(1769489238.118:532): pid=1952 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 04:47:18.118000 audit[1952]: CRED_DISP pid=1952 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 27 04:47:18.214132 sshd[1951]: Connection closed by 4.153.228.146 port 39974 Jan 27 04:47:18.213978 sshd-session[1947]: pam_unix(sshd:session): session closed for user core Jan 27 04:47:18.215000 audit[1947]: USER_END pid=1947 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:47:18.222030 systemd[1]: sshd@8-10.0.3.32:22-4.153.228.146:39974.service: Deactivated successfully. Jan 27 04:47:18.215000 audit[1947]: CRED_DISP pid=1947 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:47:18.226140 kernel: audit: type=1106 audit(1769489238.215:533): pid=1947 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:47:18.226196 kernel: audit: type=1104 audit(1769489238.215:534): pid=1947 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:47:18.226447 systemd[1]: session-10.scope: Deactivated successfully. Jan 27 04:47:18.226754 systemd[1]: session-10.scope: Consumed 6.929s CPU time, 223.9M memory peak. Jan 27 04:47:18.224000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.3.32:22-4.153.228.146:39974 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:47:18.229874 kernel: audit: type=1131 audit(1769489238.224:535): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.3.32:22-4.153.228.146:39974 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:47:18.228650 systemd-logind[1653]: Session 10 logged out. Waiting for processes to exit. Jan 27 04:47:18.230649 systemd-logind[1653]: Removed session 10. 
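Note on the audit records in this section: each PROCTITLE field is the command line of the process that issued the netlink call, hex-encoded with NUL bytes separating the arguments, which is why the same xtables invocations appear both as comm="iptables-restor" (the kernel's 15-character comm) and as long hex strings. As a minimal, illustrative sketch (not part of the captured log; the helper name is arbitrary), the value recorded for the iptables-restore calls decodes like this in Python:

    # Illustrative helper (assumption, not from the log): decode an audit PROCTITLE
    # value, i.e. the process command line hex-encoded with NUL-separated arguments.
    def decode_proctitle(hex_value: str) -> str:
        return bytes.fromhex(hex_value).replace(b"\x00", b" ").decode("utf-8", "replace")

    # Value taken verbatim from the iptables-restore records above; it decodes to
    # "iptables-restore -w 5 -W 100000 --noflush --counters".
    print(decode_proctitle(
        "69707461626C65732D726573746F7265002D770035002D5700"
        "313030303030002D2D6E6F666C757368002D2D636F756E74657273"
    ))

The decoded string matches the kube-proxy rule syncs recorded above (filter/nat registrations for the KUBE-* chains in both the IPv4 and IPv6 tables).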
Jan 27 04:47:19.666000 audit[3323]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:19.666000 audit[3323]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffed6deca0 a2=0 a3=1 items=0 ppid=3073 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:19.672613 kernel: audit: type=1325 audit(1769489239.666:536): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:19.672677 kernel: audit: type=1300 audit(1769489239.666:536): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffed6deca0 a2=0 a3=1 items=0 ppid=3073 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:19.666000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:19.675371 kernel: audit: type=1327 audit(1769489239.666:536): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:19.674000 audit[3323]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:19.677121 kernel: audit: type=1325 audit(1769489239.674:537): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:19.674000 audit[3323]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffed6deca0 a2=0 a3=1 items=0 ppid=3073 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:19.682637 kernel: audit: type=1300 audit(1769489239.674:537): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffed6deca0 a2=0 a3=1 items=0 ppid=3073 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:19.674000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:19.697000 audit[3325]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3325 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:19.697000 audit[3325]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffd57259b0 a2=0 a3=1 items=0 ppid=3073 pid=3325 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:19.697000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:19.704000 audit[3325]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3325 subj=system_u:system_r:kernel_t:s0 
comm="iptables-restor" Jan 27 04:47:19.704000 audit[3325]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd57259b0 a2=0 a3=1 items=0 ppid=3073 pid=3325 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:19.704000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:23.448781 kernel: kauditd_printk_skb: 7 callbacks suppressed Jan 27 04:47:23.448868 kernel: audit: type=1325 audit(1769489243.444:540): table=filter:109 family=2 entries=17 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:23.444000 audit[3327]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:23.444000 audit[3327]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffc63cafe0 a2=0 a3=1 items=0 ppid=3073 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:23.444000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:23.456817 kernel: audit: type=1300 audit(1769489243.444:540): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffc63cafe0 a2=0 a3=1 items=0 ppid=3073 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:23.456880 kernel: audit: type=1327 audit(1769489243.444:540): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:23.462000 audit[3327]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:23.462000 audit[3327]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc63cafe0 a2=0 a3=1 items=0 ppid=3073 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:23.468019 kernel: audit: type=1325 audit(1769489243.462:541): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:23.468117 kernel: audit: type=1300 audit(1769489243.462:541): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc63cafe0 a2=0 a3=1 items=0 ppid=3073 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:23.462000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:23.470678 kernel: audit: type=1327 audit(1769489243.462:541): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:23.482000 audit[3329]: NETFILTER_CFG 
table=filter:111 family=2 entries=18 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:23.482000 audit[3329]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffff406ecd0 a2=0 a3=1 items=0 ppid=3073 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:23.487741 kernel: audit: type=1325 audit(1769489243.482:542): table=filter:111 family=2 entries=18 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:23.487814 kernel: audit: type=1300 audit(1769489243.482:542): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffff406ecd0 a2=0 a3=1 items=0 ppid=3073 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:23.487835 kernel: audit: type=1327 audit(1769489243.482:542): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:23.482000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:23.491000 audit[3329]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:23.491000 audit[3329]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff406ecd0 a2=0 a3=1 items=0 ppid=3073 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:23.491000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:23.494125 kernel: audit: type=1325 audit(1769489243.491:543): table=nat:112 family=2 entries=12 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:24.501000 audit[3331]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3331 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:24.501000 audit[3331]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc50184b0 a2=0 a3=1 items=0 ppid=3073 pid=3331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:24.501000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:24.509000 audit[3331]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3331 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:24.509000 audit[3331]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc50184b0 a2=0 a3=1 items=0 ppid=3073 pid=3331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:24.509000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:25.839000 audit[3333]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3333 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:25.839000 audit[3333]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffd4e86d20 a2=0 a3=1 items=0 ppid=3073 pid=3333 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:25.839000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:25.845000 audit[3333]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3333 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:25.845000 audit[3333]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd4e86d20 a2=0 a3=1 items=0 ppid=3073 pid=3333 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:25.845000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:25.875363 systemd[1]: Created slice kubepods-besteffort-pod00771dea_761d_442b_a1ec_f4169b13909e.slice - libcontainer container kubepods-besteffort-pod00771dea_761d_442b_a1ec_f4169b13909e.slice. Jan 27 04:47:25.948536 kubelet[2905]: I0127 04:47:25.948489 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df589\" (UniqueName: \"kubernetes.io/projected/00771dea-761d-442b-a1ec-f4169b13909e-kube-api-access-df589\") pod \"calico-typha-5cc9894cdd-9qx49\" (UID: \"00771dea-761d-442b-a1ec-f4169b13909e\") " pod="calico-system/calico-typha-5cc9894cdd-9qx49" Jan 27 04:47:25.948536 kubelet[2905]: I0127 04:47:25.948542 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00771dea-761d-442b-a1ec-f4169b13909e-tigera-ca-bundle\") pod \"calico-typha-5cc9894cdd-9qx49\" (UID: \"00771dea-761d-442b-a1ec-f4169b13909e\") " pod="calico-system/calico-typha-5cc9894cdd-9qx49" Jan 27 04:47:25.948937 kubelet[2905]: I0127 04:47:25.948563 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/00771dea-761d-442b-a1ec-f4169b13909e-typha-certs\") pod \"calico-typha-5cc9894cdd-9qx49\" (UID: \"00771dea-761d-442b-a1ec-f4169b13909e\") " pod="calico-system/calico-typha-5cc9894cdd-9qx49" Jan 27 04:47:26.060938 systemd[1]: Created slice kubepods-besteffort-pod1595a63a_3432_48e7_8864_c656375ba257.slice - libcontainer container kubepods-besteffort-pod1595a63a_3432_48e7_8864_c656375ba257.slice. 
Jan 27 04:47:26.149281 kubelet[2905]: I0127 04:47:26.149104 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1595a63a-3432-48e7-8864-c656375ba257-lib-modules\") pod \"calico-node-cb7q9\" (UID: \"1595a63a-3432-48e7-8864-c656375ba257\") " pod="calico-system/calico-node-cb7q9" Jan 27 04:47:26.149281 kubelet[2905]: I0127 04:47:26.149150 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1595a63a-3432-48e7-8864-c656375ba257-xtables-lock\") pod \"calico-node-cb7q9\" (UID: \"1595a63a-3432-48e7-8864-c656375ba257\") " pod="calico-system/calico-node-cb7q9" Jan 27 04:47:26.149281 kubelet[2905]: I0127 04:47:26.149171 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/1595a63a-3432-48e7-8864-c656375ba257-cni-net-dir\") pod \"calico-node-cb7q9\" (UID: \"1595a63a-3432-48e7-8864-c656375ba257\") " pod="calico-system/calico-node-cb7q9" Jan 27 04:47:26.149281 kubelet[2905]: I0127 04:47:26.149188 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/1595a63a-3432-48e7-8864-c656375ba257-cni-log-dir\") pod \"calico-node-cb7q9\" (UID: \"1595a63a-3432-48e7-8864-c656375ba257\") " pod="calico-system/calico-node-cb7q9" Jan 27 04:47:26.149281 kubelet[2905]: I0127 04:47:26.149204 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/1595a63a-3432-48e7-8864-c656375ba257-flexvol-driver-host\") pod \"calico-node-cb7q9\" (UID: \"1595a63a-3432-48e7-8864-c656375ba257\") " pod="calico-system/calico-node-cb7q9" Jan 27 04:47:26.149775 kubelet[2905]: I0127 04:47:26.149221 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/1595a63a-3432-48e7-8864-c656375ba257-node-certs\") pod \"calico-node-cb7q9\" (UID: \"1595a63a-3432-48e7-8864-c656375ba257\") " pod="calico-system/calico-node-cb7q9" Jan 27 04:47:26.149775 kubelet[2905]: I0127 04:47:26.149238 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkm8r\" (UniqueName: \"kubernetes.io/projected/1595a63a-3432-48e7-8864-c656375ba257-kube-api-access-dkm8r\") pod \"calico-node-cb7q9\" (UID: \"1595a63a-3432-48e7-8864-c656375ba257\") " pod="calico-system/calico-node-cb7q9" Jan 27 04:47:26.149775 kubelet[2905]: I0127 04:47:26.149320 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1595a63a-3432-48e7-8864-c656375ba257-tigera-ca-bundle\") pod \"calico-node-cb7q9\" (UID: \"1595a63a-3432-48e7-8864-c656375ba257\") " pod="calico-system/calico-node-cb7q9" Jan 27 04:47:26.149775 kubelet[2905]: I0127 04:47:26.149354 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/1595a63a-3432-48e7-8864-c656375ba257-var-run-calico\") pod \"calico-node-cb7q9\" (UID: \"1595a63a-3432-48e7-8864-c656375ba257\") " pod="calico-system/calico-node-cb7q9" Jan 27 04:47:26.149775 kubelet[2905]: I0127 04:47:26.149382 2905 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/1595a63a-3432-48e7-8864-c656375ba257-policysync\") pod \"calico-node-cb7q9\" (UID: \"1595a63a-3432-48e7-8864-c656375ba257\") " pod="calico-system/calico-node-cb7q9" Jan 27 04:47:26.149878 kubelet[2905]: I0127 04:47:26.149402 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1595a63a-3432-48e7-8864-c656375ba257-var-lib-calico\") pod \"calico-node-cb7q9\" (UID: \"1595a63a-3432-48e7-8864-c656375ba257\") " pod="calico-system/calico-node-cb7q9" Jan 27 04:47:26.149878 kubelet[2905]: I0127 04:47:26.149425 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/1595a63a-3432-48e7-8864-c656375ba257-cni-bin-dir\") pod \"calico-node-cb7q9\" (UID: \"1595a63a-3432-48e7-8864-c656375ba257\") " pod="calico-system/calico-node-cb7q9" Jan 27 04:47:26.181864 containerd[1666]: time="2026-01-27T04:47:26.181821342Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5cc9894cdd-9qx49,Uid:00771dea-761d-442b-a1ec-f4169b13909e,Namespace:calico-system,Attempt:0,}" Jan 27 04:47:26.215850 containerd[1666]: time="2026-01-27T04:47:26.215803396Z" level=info msg="connecting to shim e0ffdd08edb5b81625c06c103439214930ca46af42227a1650f8a4bf5716712f" address="unix:///run/containerd/s/3dd30a78d2dcbe1b633af62a4558a8de123dca779e4fcb61a137a758cc64301e" namespace=k8s.io protocol=ttrpc version=3 Jan 27 04:47:26.240504 systemd[1]: Started cri-containerd-e0ffdd08edb5b81625c06c103439214930ca46af42227a1650f8a4bf5716712f.scope - libcontainer container e0ffdd08edb5b81625c06c103439214930ca46af42227a1650f8a4bf5716712f. Jan 27 04:47:26.250582 kubelet[2905]: E0127 04:47:26.250553 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.251547 kubelet[2905]: W0127 04:47:26.251517 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.251878 kubelet[2905]: E0127 04:47:26.251826 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.252273 kubelet[2905]: E0127 04:47:26.252251 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.252427 kubelet[2905]: W0127 04:47:26.252356 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.252468 kubelet[2905]: E0127 04:47:26.252416 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.252718 kubelet[2905]: E0127 04:47:26.252618 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.252718 kubelet[2905]: W0127 04:47:26.252631 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.252718 kubelet[2905]: E0127 04:47:26.252655 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.252874 kubelet[2905]: E0127 04:47:26.252860 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.252954 kubelet[2905]: W0127 04:47:26.252942 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.253148 kubelet[2905]: E0127 04:47:26.253087 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.253437 kubelet[2905]: E0127 04:47:26.253301 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.253880 kubelet[2905]: W0127 04:47:26.253501 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.253880 kubelet[2905]: E0127 04:47:26.253678 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.254182 kubelet[2905]: E0127 04:47:26.254165 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.254308 kubelet[2905]: W0127 04:47:26.254236 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.254308 kubelet[2905]: E0127 04:47:26.254277 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.254605 kubelet[2905]: E0127 04:47:26.254509 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.254605 kubelet[2905]: W0127 04:47:26.254521 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.254605 kubelet[2905]: E0127 04:47:26.254548 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.254741 kubelet[2905]: E0127 04:47:26.254728 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.254801 kubelet[2905]: W0127 04:47:26.254790 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.254940 kubelet[2905]: E0127 04:47:26.254895 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.255163 kubelet[2905]: E0127 04:47:26.255149 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.255284 kubelet[2905]: W0127 04:47:26.255215 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.255284 kubelet[2905]: E0127 04:47:26.255266 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.255474 kubelet[2905]: E0127 04:47:26.255462 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.255538 kubelet[2905]: W0127 04:47:26.255527 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.255746 kubelet[2905]: E0127 04:47:26.255708 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.255952 kubelet[2905]: E0127 04:47:26.255875 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.255952 kubelet[2905]: W0127 04:47:26.255889 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.255952 kubelet[2905]: E0127 04:47:26.255940 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.255000 audit: BPF prog-id=154 op=LOAD Jan 27 04:47:26.256316 kubelet[2905]: E0127 04:47:26.256150 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.256316 kubelet[2905]: W0127 04:47:26.256163 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.256316 kubelet[2905]: E0127 04:47:26.256211 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.256569 kubelet[2905]: E0127 04:47:26.256447 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.256706 kubelet[2905]: W0127 04:47:26.256631 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.256790 kubelet[2905]: E0127 04:47:26.256766 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.256981 kubelet[2905]: E0127 04:47:26.256967 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.257216 kubelet[2905]: W0127 04:47:26.257005 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.257216 kubelet[2905]: E0127 04:47:26.257026 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.257320 kubelet[2905]: E0127 04:47:26.257307 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.257382 kubelet[2905]: W0127 04:47:26.257371 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.257447 kubelet[2905]: E0127 04:47:26.257429 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.257683 kubelet[2905]: E0127 04:47:26.257665 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.257776 kubelet[2905]: W0127 04:47:26.257683 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.257776 kubelet[2905]: E0127 04:47:26.257705 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.257911 kubelet[2905]: E0127 04:47:26.257896 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.257947 kubelet[2905]: W0127 04:47:26.257913 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.257947 kubelet[2905]: E0127 04:47:26.257922 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.258713 kubelet[2905]: E0127 04:47:26.258693 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.258713 kubelet[2905]: W0127 04:47:26.258712 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.258781 kubelet[2905]: E0127 04:47:26.258725 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.256000 audit: BPF prog-id=155 op=LOAD Jan 27 04:47:26.256000 audit[3355]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3344 pid=3355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6530666664643038656462356238313632356330366331303334333932 Jan 27 04:47:26.257000 audit: BPF prog-id=155 op=UNLOAD Jan 27 04:47:26.257000 audit[3355]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3344 pid=3355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.257000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6530666664643038656462356238313632356330366331303334333932 Jan 27 04:47:26.257000 audit: BPF prog-id=156 op=LOAD Jan 27 04:47:26.257000 audit[3355]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3344 pid=3355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.257000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6530666664643038656462356238313632356330366331303334333932 Jan 27 04:47:26.257000 audit: BPF prog-id=157 op=LOAD Jan 27 04:47:26.257000 audit[3355]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3344 pid=3355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.257000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6530666664643038656462356238313632356330366331303334333932 Jan 27 04:47:26.257000 audit: BPF prog-id=157 op=UNLOAD Jan 27 04:47:26.257000 audit[3355]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3344 pid=3355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.257000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6530666664643038656462356238313632356330366331303334333932 Jan 27 04:47:26.257000 audit: BPF prog-id=156 op=UNLOAD Jan 27 04:47:26.257000 audit[3355]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3344 pid=3355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.257000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6530666664643038656462356238313632356330366331303334333932 Jan 27 04:47:26.257000 audit: BPF prog-id=158 op=LOAD Jan 27 04:47:26.257000 audit[3355]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3344 pid=3355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.257000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6530666664643038656462356238313632356330366331303334333932 Jan 27 04:47:26.263766 kubelet[2905]: E0127 04:47:26.263411 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.263766 kubelet[2905]: W0127 04:47:26.263430 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.263766 kubelet[2905]: E0127 04:47:26.263445 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.285869 containerd[1666]: time="2026-01-27T04:47:26.285752392Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5cc9894cdd-9qx49,Uid:00771dea-761d-442b-a1ec-f4169b13909e,Namespace:calico-system,Attempt:0,} returns sandbox id \"e0ffdd08edb5b81625c06c103439214930ca46af42227a1650f8a4bf5716712f\"" Jan 27 04:47:26.287420 containerd[1666]: time="2026-01-27T04:47:26.287383841Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 27 04:47:26.365947 containerd[1666]: time="2026-01-27T04:47:26.365899721Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cb7q9,Uid:1595a63a-3432-48e7-8864-c656375ba257,Namespace:calico-system,Attempt:0,}" Jan 27 04:47:26.366519 kubelet[2905]: E0127 04:47:26.366468 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:47:26.438581 containerd[1666]: time="2026-01-27T04:47:26.438322331Z" level=info msg="connecting to shim bc9194740cba4ec6d82a3ddd1b330c22515536dfa2343fcecc566d8a0da8a3a1" address="unix:///run/containerd/s/18f7b974747ceab1977f7f6c0e7380928d9b901831411d9f96d20341dcf7317a" namespace=k8s.io protocol=ttrpc version=3 Jan 27 04:47:26.446291 kubelet[2905]: E0127 04:47:26.446254 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.446291 kubelet[2905]: W0127 04:47:26.446277 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.446291 kubelet[2905]: E0127 04:47:26.446299 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.446511 kubelet[2905]: E0127 04:47:26.446442 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.446511 kubelet[2905]: W0127 04:47:26.446451 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.446511 kubelet[2905]: E0127 04:47:26.446503 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.446675 kubelet[2905]: E0127 04:47:26.446649 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.446675 kubelet[2905]: W0127 04:47:26.446660 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.446675 kubelet[2905]: E0127 04:47:26.446672 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.446878 kubelet[2905]: E0127 04:47:26.446807 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.446878 kubelet[2905]: W0127 04:47:26.446813 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.446878 kubelet[2905]: E0127 04:47:26.446821 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.447041 kubelet[2905]: E0127 04:47:26.447021 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.447041 kubelet[2905]: W0127 04:47:26.447033 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.447041 kubelet[2905]: E0127 04:47:26.447041 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.447219 kubelet[2905]: E0127 04:47:26.447201 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.447219 kubelet[2905]: W0127 04:47:26.447213 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.447219 kubelet[2905]: E0127 04:47:26.447221 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.447451 kubelet[2905]: E0127 04:47:26.447397 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.447451 kubelet[2905]: W0127 04:47:26.447409 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.447451 kubelet[2905]: E0127 04:47:26.447425 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.448143 kubelet[2905]: E0127 04:47:26.448121 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.448143 kubelet[2905]: W0127 04:47:26.448137 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.448237 kubelet[2905]: E0127 04:47:26.448149 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.448412 kubelet[2905]: E0127 04:47:26.448378 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.448412 kubelet[2905]: W0127 04:47:26.448393 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.448412 kubelet[2905]: E0127 04:47:26.448403 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.448595 kubelet[2905]: E0127 04:47:26.448574 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.448595 kubelet[2905]: W0127 04:47:26.448584 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.448646 kubelet[2905]: E0127 04:47:26.448598 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.448763 kubelet[2905]: E0127 04:47:26.448739 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.448763 kubelet[2905]: W0127 04:47:26.448757 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.448763 kubelet[2905]: E0127 04:47:26.448765 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.448953 kubelet[2905]: E0127 04:47:26.448934 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.448953 kubelet[2905]: W0127 04:47:26.448946 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.449005 kubelet[2905]: E0127 04:47:26.448955 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.449134 kubelet[2905]: E0127 04:47:26.449116 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.449134 kubelet[2905]: W0127 04:47:26.449129 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.449200 kubelet[2905]: E0127 04:47:26.449137 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.449305 kubelet[2905]: E0127 04:47:26.449285 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.449305 kubelet[2905]: W0127 04:47:26.449295 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.449363 kubelet[2905]: E0127 04:47:26.449310 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.449473 kubelet[2905]: E0127 04:47:26.449446 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.449473 kubelet[2905]: W0127 04:47:26.449464 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.449473 kubelet[2905]: E0127 04:47:26.449472 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.449640 kubelet[2905]: E0127 04:47:26.449622 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.449675 kubelet[2905]: W0127 04:47:26.449651 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.449675 kubelet[2905]: E0127 04:47:26.449660 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.449997 kubelet[2905]: E0127 04:47:26.449973 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.449997 kubelet[2905]: W0127 04:47:26.449988 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.450077 kubelet[2905]: E0127 04:47:26.450010 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.450270 kubelet[2905]: E0127 04:47:26.450251 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.450270 kubelet[2905]: W0127 04:47:26.450265 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.450336 kubelet[2905]: E0127 04:47:26.450280 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.450471 kubelet[2905]: E0127 04:47:26.450454 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.450471 kubelet[2905]: W0127 04:47:26.450467 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.450542 kubelet[2905]: E0127 04:47:26.450481 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.450673 kubelet[2905]: E0127 04:47:26.450651 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.450673 kubelet[2905]: W0127 04:47:26.450663 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.450673 kubelet[2905]: E0127 04:47:26.450672 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.451638 kubelet[2905]: E0127 04:47:26.450988 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.451638 kubelet[2905]: W0127 04:47:26.451002 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.451638 kubelet[2905]: E0127 04:47:26.451011 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.451638 kubelet[2905]: I0127 04:47:26.451043 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/56e3e6d0-7a6b-4ba3-9081-3231ea811709-kubelet-dir\") pod \"csi-node-driver-vk94b\" (UID: \"56e3e6d0-7a6b-4ba3-9081-3231ea811709\") " pod="calico-system/csi-node-driver-vk94b" Jan 27 04:47:26.451638 kubelet[2905]: E0127 04:47:26.451239 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.451638 kubelet[2905]: W0127 04:47:26.451248 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.451638 kubelet[2905]: E0127 04:47:26.451267 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.451638 kubelet[2905]: I0127 04:47:26.451281 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/56e3e6d0-7a6b-4ba3-9081-3231ea811709-varrun\") pod \"csi-node-driver-vk94b\" (UID: \"56e3e6d0-7a6b-4ba3-9081-3231ea811709\") " pod="calico-system/csi-node-driver-vk94b" Jan 27 04:47:26.451638 kubelet[2905]: E0127 04:47:26.451445 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.451869 kubelet[2905]: W0127 04:47:26.451466 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.451869 kubelet[2905]: E0127 04:47:26.451481 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.451869 kubelet[2905]: I0127 04:47:26.451496 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/56e3e6d0-7a6b-4ba3-9081-3231ea811709-socket-dir\") pod \"csi-node-driver-vk94b\" (UID: \"56e3e6d0-7a6b-4ba3-9081-3231ea811709\") " pod="calico-system/csi-node-driver-vk94b" Jan 27 04:47:26.452104 kubelet[2905]: E0127 04:47:26.452071 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.452163 kubelet[2905]: W0127 04:47:26.452087 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.452226 kubelet[2905]: E0127 04:47:26.452208 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.452256 kubelet[2905]: I0127 04:47:26.452230 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsdqj\" (UniqueName: \"kubernetes.io/projected/56e3e6d0-7a6b-4ba3-9081-3231ea811709-kube-api-access-qsdqj\") pod \"csi-node-driver-vk94b\" (UID: \"56e3e6d0-7a6b-4ba3-9081-3231ea811709\") " pod="calico-system/csi-node-driver-vk94b" Jan 27 04:47:26.452564 kubelet[2905]: E0127 04:47:26.452543 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.452564 kubelet[2905]: W0127 04:47:26.452559 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.452640 kubelet[2905]: E0127 04:47:26.452622 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.452768 kubelet[2905]: I0127 04:47:26.452652 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/56e3e6d0-7a6b-4ba3-9081-3231ea811709-registration-dir\") pod \"csi-node-driver-vk94b\" (UID: \"56e3e6d0-7a6b-4ba3-9081-3231ea811709\") " pod="calico-system/csi-node-driver-vk94b" Jan 27 04:47:26.452829 kubelet[2905]: E0127 04:47:26.452809 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.452829 kubelet[2905]: W0127 04:47:26.452824 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.452902 kubelet[2905]: E0127 04:47:26.452850 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.453058 kubelet[2905]: E0127 04:47:26.453040 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.453058 kubelet[2905]: W0127 04:47:26.453055 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.453155 kubelet[2905]: E0127 04:47:26.453128 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.453349 kubelet[2905]: E0127 04:47:26.453324 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.453349 kubelet[2905]: W0127 04:47:26.453339 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.453402 kubelet[2905]: E0127 04:47:26.453367 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.453846 kubelet[2905]: E0127 04:47:26.453745 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.453846 kubelet[2905]: W0127 04:47:26.453761 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.453846 kubelet[2905]: E0127 04:47:26.453777 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.453974 kubelet[2905]: E0127 04:47:26.453952 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.453974 kubelet[2905]: W0127 04:47:26.453969 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.454033 kubelet[2905]: E0127 04:47:26.453986 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.454195 kubelet[2905]: E0127 04:47:26.454180 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.454195 kubelet[2905]: W0127 04:47:26.454192 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.454248 kubelet[2905]: E0127 04:47:26.454203 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.454595 kubelet[2905]: E0127 04:47:26.454542 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.454595 kubelet[2905]: W0127 04:47:26.454559 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.454595 kubelet[2905]: E0127 04:47:26.454571 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.454919 kubelet[2905]: E0127 04:47:26.454900 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.454919 kubelet[2905]: W0127 04:47:26.454916 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.455166 kubelet[2905]: E0127 04:47:26.454929 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.455879 kubelet[2905]: E0127 04:47:26.455818 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.455879 kubelet[2905]: W0127 04:47:26.455861 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.456841 kubelet[2905]: E0127 04:47:26.455877 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.457168 kubelet[2905]: E0127 04:47:26.457146 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.457168 kubelet[2905]: W0127 04:47:26.457167 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.457237 kubelet[2905]: E0127 04:47:26.457182 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.463536 systemd[1]: Started cri-containerd-bc9194740cba4ec6d82a3ddd1b330c22515536dfa2343fcecc566d8a0da8a3a1.scope - libcontainer container bc9194740cba4ec6d82a3ddd1b330c22515536dfa2343fcecc566d8a0da8a3a1. Jan 27 04:47:26.473000 audit: BPF prog-id=159 op=LOAD Jan 27 04:47:26.474000 audit: BPF prog-id=160 op=LOAD Jan 27 04:47:26.474000 audit[3433]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3422 pid=3433 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.474000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263393139343734306362613465633664383261336464643162333330 Jan 27 04:47:26.474000 audit: BPF prog-id=160 op=UNLOAD Jan 27 04:47:26.474000 audit[3433]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3422 pid=3433 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.474000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263393139343734306362613465633664383261336464643162333330 Jan 27 04:47:26.475000 audit: BPF prog-id=161 op=LOAD Jan 27 04:47:26.475000 audit[3433]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3422 pid=3433 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.475000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263393139343734306362613465633664383261336464643162333330 Jan 27 04:47:26.475000 audit: BPF prog-id=162 op=LOAD Jan 27 04:47:26.475000 audit[3433]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3422 pid=3433 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.475000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263393139343734306362613465633664383261336464643162333330 Jan 27 04:47:26.475000 audit: BPF prog-id=162 op=UNLOAD Jan 27 04:47:26.475000 audit[3433]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3422 pid=3433 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.475000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263393139343734306362613465633664383261336464643162333330 Jan 27 04:47:26.475000 audit: BPF prog-id=161 op=UNLOAD Jan 27 04:47:26.475000 audit[3433]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3422 pid=3433 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.475000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263393139343734306362613465633664383261336464643162333330 Jan 27 04:47:26.475000 audit: BPF prog-id=163 op=LOAD Jan 27 04:47:26.475000 audit[3433]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3422 pid=3433 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.475000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263393139343734306362613465633664383261336464643162333330 Jan 27 04:47:26.504342 containerd[1666]: time="2026-01-27T04:47:26.504300907Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cb7q9,Uid:1595a63a-3432-48e7-8864-c656375ba257,Namespace:calico-system,Attempt:0,} returns sandbox id \"bc9194740cba4ec6d82a3ddd1b330c22515536dfa2343fcecc566d8a0da8a3a1\"" Jan 27 04:47:26.553370 kubelet[2905]: E0127 04:47:26.553336 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.553370 kubelet[2905]: W0127 04:47:26.553359 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.553370 kubelet[2905]: E0127 04:47:26.553380 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.553693 kubelet[2905]: E0127 04:47:26.553678 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.553693 kubelet[2905]: W0127 04:47:26.553691 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.553748 kubelet[2905]: E0127 04:47:26.553711 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.553908 kubelet[2905]: E0127 04:47:26.553891 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.553952 kubelet[2905]: W0127 04:47:26.553909 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.553952 kubelet[2905]: E0127 04:47:26.553929 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.554069 kubelet[2905]: E0127 04:47:26.554054 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.554069 kubelet[2905]: W0127 04:47:26.554064 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.554186 kubelet[2905]: E0127 04:47:26.554078 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.554234 kubelet[2905]: E0127 04:47:26.554222 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.554234 kubelet[2905]: W0127 04:47:26.554232 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.554295 kubelet[2905]: E0127 04:47:26.554245 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.554407 kubelet[2905]: E0127 04:47:26.554395 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.554407 kubelet[2905]: W0127 04:47:26.554405 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.554479 kubelet[2905]: E0127 04:47:26.554418 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.555803 kubelet[2905]: E0127 04:47:26.554881 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.555803 kubelet[2905]: W0127 04:47:26.555705 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.555803 kubelet[2905]: E0127 04:47:26.555746 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.555962 kubelet[2905]: E0127 04:47:26.555937 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.555962 kubelet[2905]: W0127 04:47:26.555952 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.555962 kubelet[2905]: E0127 04:47:26.555969 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.556145 kubelet[2905]: E0127 04:47:26.556134 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.556145 kubelet[2905]: W0127 04:47:26.556144 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.556230 kubelet[2905]: E0127 04:47:26.556158 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.556292 kubelet[2905]: E0127 04:47:26.556280 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.556292 kubelet[2905]: W0127 04:47:26.556289 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.556342 kubelet[2905]: E0127 04:47:26.556304 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.556496 kubelet[2905]: E0127 04:47:26.556479 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.556496 kubelet[2905]: W0127 04:47:26.556489 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.556556 kubelet[2905]: E0127 04:47:26.556502 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.556661 kubelet[2905]: E0127 04:47:26.556636 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.556661 kubelet[2905]: W0127 04:47:26.556646 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.556661 kubelet[2905]: E0127 04:47:26.556667 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.556841 kubelet[2905]: E0127 04:47:26.556828 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.556841 kubelet[2905]: W0127 04:47:26.556839 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.556894 kubelet[2905]: E0127 04:47:26.556852 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.557119 kubelet[2905]: E0127 04:47:26.557085 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.557119 kubelet[2905]: W0127 04:47:26.557118 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.557179 kubelet[2905]: E0127 04:47:26.557138 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.557306 kubelet[2905]: E0127 04:47:26.557295 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.557306 kubelet[2905]: W0127 04:47:26.557306 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.557535 kubelet[2905]: E0127 04:47:26.557321 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.557535 kubelet[2905]: E0127 04:47:26.557504 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.557535 kubelet[2905]: W0127 04:47:26.557513 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.557535 kubelet[2905]: E0127 04:47:26.557527 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.557719 kubelet[2905]: E0127 04:47:26.557695 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.557719 kubelet[2905]: W0127 04:47:26.557708 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.557802 kubelet[2905]: E0127 04:47:26.557725 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.557903 kubelet[2905]: E0127 04:47:26.557888 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.557903 kubelet[2905]: W0127 04:47:26.557897 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.557979 kubelet[2905]: E0127 04:47:26.557906 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.558047 kubelet[2905]: E0127 04:47:26.558034 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.558047 kubelet[2905]: W0127 04:47:26.558045 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.558147 kubelet[2905]: E0127 04:47:26.558128 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.558195 kubelet[2905]: E0127 04:47:26.558179 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.558195 kubelet[2905]: W0127 04:47:26.558189 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.558252 kubelet[2905]: E0127 04:47:26.558198 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.558330 kubelet[2905]: E0127 04:47:26.558316 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.558330 kubelet[2905]: W0127 04:47:26.558326 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.558403 kubelet[2905]: E0127 04:47:26.558339 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.558467 kubelet[2905]: E0127 04:47:26.558455 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.558467 kubelet[2905]: W0127 04:47:26.558465 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.558510 kubelet[2905]: E0127 04:47:26.558487 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.558709 kubelet[2905]: E0127 04:47:26.558693 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.558709 kubelet[2905]: W0127 04:47:26.558708 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.558764 kubelet[2905]: E0127 04:47:26.558727 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.558927 kubelet[2905]: E0127 04:47:26.558913 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.558927 kubelet[2905]: W0127 04:47:26.558924 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.558977 kubelet[2905]: E0127 04:47:26.558937 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.559116 kubelet[2905]: E0127 04:47:26.559097 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.559116 kubelet[2905]: W0127 04:47:26.559109 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.559178 kubelet[2905]: E0127 04:47:26.559117 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:26.565917 kubelet[2905]: E0127 04:47:26.565872 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:26.565917 kubelet[2905]: W0127 04:47:26.565902 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:26.565917 kubelet[2905]: E0127 04:47:26.565922 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:26.857000 audit[3521]: NETFILTER_CFG table=filter:117 family=2 entries=22 op=nft_register_rule pid=3521 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:26.857000 audit[3521]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=fffff4fce540 a2=0 a3=1 items=0 ppid=3073 pid=3521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.857000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:26.864000 audit[3521]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=3521 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:26.864000 audit[3521]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff4fce540 a2=0 a3=1 items=0 ppid=3073 pid=3521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:26.864000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:27.825780 kubelet[2905]: E0127 04:47:27.825525 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:47:27.861890 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount385876254.mount: Deactivated successfully. 
Jan 27 04:47:28.865366 containerd[1666]: time="2026-01-27T04:47:28.865252952Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:47:28.867852 containerd[1666]: time="2026-01-27T04:47:28.867795045Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=31716861" Jan 27 04:47:28.873483 containerd[1666]: time="2026-01-27T04:47:28.873393714Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:47:28.893323 containerd[1666]: time="2026-01-27T04:47:28.893213855Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:47:28.893912 containerd[1666]: time="2026-01-27T04:47:28.893855618Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 2.606327897s" Jan 27 04:47:28.893912 containerd[1666]: time="2026-01-27T04:47:28.893900298Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Jan 27 04:47:28.895066 containerd[1666]: time="2026-01-27T04:47:28.894987224Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 27 04:47:28.905684 containerd[1666]: time="2026-01-27T04:47:28.904763154Z" level=info msg="CreateContainer within sandbox \"e0ffdd08edb5b81625c06c103439214930ca46af42227a1650f8a4bf5716712f\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 27 04:47:28.914850 containerd[1666]: time="2026-01-27T04:47:28.913876520Z" level=info msg="Container b35402da1c710b583d525075917bdd8133719193513e10b0ebde64b712b5ae93: CDI devices from CRI Config.CDIDevices: []" Jan 27 04:47:28.930850 containerd[1666]: time="2026-01-27T04:47:28.930731726Z" level=info msg="CreateContainer within sandbox \"e0ffdd08edb5b81625c06c103439214930ca46af42227a1650f8a4bf5716712f\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"b35402da1c710b583d525075917bdd8133719193513e10b0ebde64b712b5ae93\"" Jan 27 04:47:28.932799 containerd[1666]: time="2026-01-27T04:47:28.931469370Z" level=info msg="StartContainer for \"b35402da1c710b583d525075917bdd8133719193513e10b0ebde64b712b5ae93\"" Jan 27 04:47:28.932799 containerd[1666]: time="2026-01-27T04:47:28.932484175Z" level=info msg="connecting to shim b35402da1c710b583d525075917bdd8133719193513e10b0ebde64b712b5ae93" address="unix:///run/containerd/s/3dd30a78d2dcbe1b633af62a4558a8de123dca779e4fcb61a137a758cc64301e" protocol=ttrpc version=3 Jan 27 04:47:28.950270 systemd[1]: Started cri-containerd-b35402da1c710b583d525075917bdd8133719193513e10b0ebde64b712b5ae93.scope - libcontainer container b35402da1c710b583d525075917bdd8133719193513e10b0ebde64b712b5ae93. 
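Note: the calico/typha pull above reports bytes read=31716861 against a wall-clock pull time of 2.606327897s, i.e. roughly 12 MB/s; the reported repo size (33090541 bytes) is a separate figure. The sketch below just redoes that arithmetic with the two numbers copied from the containerd entries; nothing else is assumed.

// Back-of-envelope transfer rate for the calico/typha:v3.30.4 pull logged
// above. Constants are copied verbatim from the containerd entries
// ("bytes read=31716861", `in 2.606327897s`).
package main

import "fmt"

func main() {
	const bytesRead = 31716861.0
	const seconds = 2.606327897
	rate := bytesRead / seconds
	fmt.Printf("effective pull rate: %.1f MB/s (%.1f MiB/s)\n", rate/1e6, rate/(1024*1024))
}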
Jan 27 04:47:28.959000 audit: BPF prog-id=164 op=LOAD Jan 27 04:47:28.961475 kernel: kauditd_printk_skb: 64 callbacks suppressed Jan 27 04:47:28.961523 kernel: audit: type=1334 audit(1769489248.959:566): prog-id=164 op=LOAD Jan 27 04:47:28.961000 audit: BPF prog-id=165 op=LOAD Jan 27 04:47:28.961000 audit[3532]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3344 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:28.966533 kernel: audit: type=1334 audit(1769489248.961:567): prog-id=165 op=LOAD Jan 27 04:47:28.966576 kernel: audit: type=1300 audit(1769489248.961:567): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3344 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:28.966748 kernel: audit: type=1327 audit(1769489248.961:567): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233353430326461316337313062353833643532353037353931376264 Jan 27 04:47:28.961000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233353430326461316337313062353833643532353037353931376264 Jan 27 04:47:28.961000 audit: BPF prog-id=165 op=UNLOAD Jan 27 04:47:28.971527 kernel: audit: type=1334 audit(1769489248.961:568): prog-id=165 op=UNLOAD Jan 27 04:47:28.961000 audit[3532]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3344 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:28.974652 kernel: audit: type=1300 audit(1769489248.961:568): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3344 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:28.975324 kernel: audit: type=1327 audit(1769489248.961:568): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233353430326461316337313062353833643532353037353931376264 Jan 27 04:47:28.961000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233353430326461316337313062353833643532353037353931376264 Jan 27 04:47:28.961000 audit: BPF prog-id=166 op=LOAD Jan 27 04:47:28.978443 kernel: audit: type=1334 audit(1769489248.961:569): prog-id=166 op=LOAD Jan 27 04:47:28.978536 kernel: audit: type=1300 audit(1769489248.961:569): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3344 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:28.961000 audit[3532]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3344 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:28.961000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233353430326461316337313062353833643532353037353931376264 Jan 27 04:47:28.984401 kernel: audit: type=1327 audit(1769489248.961:569): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233353430326461316337313062353833643532353037353931376264 Jan 27 04:47:28.961000 audit: BPF prog-id=167 op=LOAD Jan 27 04:47:28.961000 audit[3532]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3344 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:28.961000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233353430326461316337313062353833643532353037353931376264 Jan 27 04:47:28.961000 audit: BPF prog-id=167 op=UNLOAD Jan 27 04:47:28.961000 audit[3532]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3344 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:28.961000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233353430326461316337313062353833643532353037353931376264 Jan 27 04:47:28.961000 audit: BPF prog-id=166 op=UNLOAD Jan 27 04:47:28.961000 audit[3532]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3344 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:28.961000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233353430326461316337313062353833643532353037353931376264 Jan 27 04:47:28.961000 audit: BPF prog-id=168 op=LOAD Jan 27 04:47:28.961000 audit[3532]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3344 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:28.961000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233353430326461316337313062353833643532353037353931376264 Jan 27 04:47:29.005723 containerd[1666]: time="2026-01-27T04:47:29.005684509Z" level=info msg="StartContainer for \"b35402da1c710b583d525075917bdd8133719193513e10b0ebde64b712b5ae93\" returns successfully" Jan 27 04:47:29.826039 kubelet[2905]: E0127 04:47:29.825187 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:47:29.972617 kubelet[2905]: E0127 04:47:29.972527 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.972617 kubelet[2905]: W0127 04:47:29.972554 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.972617 kubelet[2905]: E0127 04:47:29.972577 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.973221 kubelet[2905]: E0127 04:47:29.973050 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.973221 kubelet[2905]: W0127 04:47:29.973065 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.973221 kubelet[2905]: E0127 04:47:29.973130 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.973403 kubelet[2905]: E0127 04:47:29.973388 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.973574 kubelet[2905]: W0127 04:47:29.973456 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.973574 kubelet[2905]: E0127 04:47:29.973473 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.973714 kubelet[2905]: E0127 04:47:29.973700 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.973874 kubelet[2905]: W0127 04:47:29.973762 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.973874 kubelet[2905]: E0127 04:47:29.973778 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:29.974007 kubelet[2905]: E0127 04:47:29.973993 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.974173 kubelet[2905]: W0127 04:47:29.974056 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.974173 kubelet[2905]: E0127 04:47:29.974072 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.974306 kubelet[2905]: E0127 04:47:29.974293 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.974446 kubelet[2905]: W0127 04:47:29.974349 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.974446 kubelet[2905]: E0127 04:47:29.974363 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.974564 kubelet[2905]: E0127 04:47:29.974552 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.974627 kubelet[2905]: W0127 04:47:29.974614 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.974681 kubelet[2905]: E0127 04:47:29.974670 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.974973 kubelet[2905]: E0127 04:47:29.974862 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.974973 kubelet[2905]: W0127 04:47:29.974874 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.974973 kubelet[2905]: E0127 04:47:29.974886 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.975138 kubelet[2905]: E0127 04:47:29.975125 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.975207 kubelet[2905]: W0127 04:47:29.975195 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.975261 kubelet[2905]: E0127 04:47:29.975251 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:29.975520 kubelet[2905]: E0127 04:47:29.975432 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.975520 kubelet[2905]: W0127 04:47:29.975444 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.975520 kubelet[2905]: E0127 04:47:29.975454 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.975670 kubelet[2905]: E0127 04:47:29.975658 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.975829 kubelet[2905]: W0127 04:47:29.975720 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.975829 kubelet[2905]: E0127 04:47:29.975735 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.975953 kubelet[2905]: E0127 04:47:29.975941 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.976008 kubelet[2905]: W0127 04:47:29.975997 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.976163 kubelet[2905]: E0127 04:47:29.976054 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.976271 kubelet[2905]: E0127 04:47:29.976259 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.976423 kubelet[2905]: W0127 04:47:29.976321 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.976423 kubelet[2905]: E0127 04:47:29.976337 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.976548 kubelet[2905]: E0127 04:47:29.976536 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.976604 kubelet[2905]: W0127 04:47:29.976594 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.976656 kubelet[2905]: E0127 04:47:29.976646 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:29.977175 kubelet[2905]: E0127 04:47:29.977003 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.977175 kubelet[2905]: W0127 04:47:29.977017 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.977175 kubelet[2905]: E0127 04:47:29.977028 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.981515 kubelet[2905]: E0127 04:47:29.981451 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.981515 kubelet[2905]: W0127 04:47:29.981466 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.981515 kubelet[2905]: E0127 04:47:29.981479 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.981669 kubelet[2905]: E0127 04:47:29.981650 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.981669 kubelet[2905]: W0127 04:47:29.981663 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.981669 kubelet[2905]: E0127 04:47:29.981677 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.981962 kubelet[2905]: E0127 04:47:29.981943 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.982023 kubelet[2905]: W0127 04:47:29.982010 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.982115 kubelet[2905]: E0127 04:47:29.982085 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.982310 kubelet[2905]: E0127 04:47:29.982264 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.982310 kubelet[2905]: W0127 04:47:29.982284 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.982310 kubelet[2905]: E0127 04:47:29.982301 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:29.982493 kubelet[2905]: E0127 04:47:29.982479 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.982493 kubelet[2905]: W0127 04:47:29.982491 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.982575 kubelet[2905]: E0127 04:47:29.982506 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.982640 kubelet[2905]: E0127 04:47:29.982628 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.982640 kubelet[2905]: W0127 04:47:29.982638 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.982693 kubelet[2905]: E0127 04:47:29.982651 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.982788 kubelet[2905]: E0127 04:47:29.982777 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.982827 kubelet[2905]: W0127 04:47:29.982787 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.982827 kubelet[2905]: E0127 04:47:29.982812 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.982928 kubelet[2905]: E0127 04:47:29.982917 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.982971 kubelet[2905]: W0127 04:47:29.982927 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.982971 kubelet[2905]: E0127 04:47:29.982947 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.983062 kubelet[2905]: E0127 04:47:29.983051 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.983062 kubelet[2905]: W0127 04:47:29.983061 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.983133 kubelet[2905]: E0127 04:47:29.983074 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:29.983282 kubelet[2905]: E0127 04:47:29.983240 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.983282 kubelet[2905]: W0127 04:47:29.983253 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.983282 kubelet[2905]: E0127 04:47:29.983265 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.983425 kubelet[2905]: E0127 04:47:29.983413 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.983425 kubelet[2905]: W0127 04:47:29.983424 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.983477 kubelet[2905]: E0127 04:47:29.983438 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.983680 kubelet[2905]: E0127 04:47:29.983664 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.983718 kubelet[2905]: W0127 04:47:29.983680 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.983718 kubelet[2905]: E0127 04:47:29.983696 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.983859 kubelet[2905]: E0127 04:47:29.983846 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.983891 kubelet[2905]: W0127 04:47:29.983860 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.983891 kubelet[2905]: E0127 04:47:29.983876 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.984052 kubelet[2905]: E0127 04:47:29.984039 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.984086 kubelet[2905]: W0127 04:47:29.984052 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.984086 kubelet[2905]: E0127 04:47:29.984069 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:29.984245 kubelet[2905]: E0127 04:47:29.984233 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.984276 kubelet[2905]: W0127 04:47:29.984246 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.984276 kubelet[2905]: E0127 04:47:29.984257 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.984399 kubelet[2905]: E0127 04:47:29.984389 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.984434 kubelet[2905]: W0127 04:47:29.984399 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.984434 kubelet[2905]: E0127 04:47:29.984408 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.984579 kubelet[2905]: E0127 04:47:29.984568 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.984579 kubelet[2905]: W0127 04:47:29.984579 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.984631 kubelet[2905]: E0127 04:47:29.984587 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 27 04:47:29.984845 kubelet[2905]: E0127 04:47:29.984833 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 27 04:47:29.984878 kubelet[2905]: W0127 04:47:29.984846 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 27 04:47:29.984878 kubelet[2905]: E0127 04:47:29.984855 2905 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 27 04:47:30.199566 containerd[1666]: time="2026-01-27T04:47:30.199229638Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:47:30.201956 containerd[1666]: time="2026-01-27T04:47:30.201895811Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 27 04:47:30.203963 containerd[1666]: time="2026-01-27T04:47:30.203894142Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:47:30.208527 containerd[1666]: time="2026-01-27T04:47:30.208418325Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:47:30.209551 containerd[1666]: time="2026-01-27T04:47:30.209521450Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.314500266s" Jan 27 04:47:30.209746 containerd[1666]: time="2026-01-27T04:47:30.209644011Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Jan 27 04:47:30.212156 containerd[1666]: time="2026-01-27T04:47:30.212113543Z" level=info msg="CreateContainer within sandbox \"bc9194740cba4ec6d82a3ddd1b330c22515536dfa2343fcecc566d8a0da8a3a1\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 27 04:47:30.229711 containerd[1666]: time="2026-01-27T04:47:30.229661193Z" level=info msg="Container f1af6a9348bd8ccaee654528e815aa84984b9c84592aaee923c7ca124de96e5c: CDI devices from CRI Config.CDIDevices: []" Jan 27 04:47:30.245855 containerd[1666]: time="2026-01-27T04:47:30.245773115Z" level=info msg="CreateContainer within sandbox \"bc9194740cba4ec6d82a3ddd1b330c22515536dfa2343fcecc566d8a0da8a3a1\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"f1af6a9348bd8ccaee654528e815aa84984b9c84592aaee923c7ca124de96e5c\"" Jan 27 04:47:30.248119 containerd[1666]: time="2026-01-27T04:47:30.248014087Z" level=info msg="StartContainer for \"f1af6a9348bd8ccaee654528e815aa84984b9c84592aaee923c7ca124de96e5c\"" Jan 27 04:47:30.250392 containerd[1666]: time="2026-01-27T04:47:30.250361379Z" level=info msg="connecting to shim f1af6a9348bd8ccaee654528e815aa84984b9c84592aaee923c7ca124de96e5c" address="unix:///run/containerd/s/18f7b974747ceab1977f7f6c0e7380928d9b901831411d9f96d20341dcf7317a" protocol=ttrpc version=3 Jan 27 04:47:30.276304 systemd[1]: Started cri-containerd-f1af6a9348bd8ccaee654528e815aa84984b9c84592aaee923c7ca124de96e5c.scope - libcontainer container f1af6a9348bd8ccaee654528e815aa84984b9c84592aaee923c7ca124de96e5c. 
Jan 27 04:47:30.340000 audit: BPF prog-id=169 op=LOAD Jan 27 04:47:30.340000 audit[3610]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3422 pid=3610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:30.340000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631616636613933343862643863636165653635343532386538313561 Jan 27 04:47:30.340000 audit: BPF prog-id=170 op=LOAD Jan 27 04:47:30.340000 audit[3610]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3422 pid=3610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:30.340000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631616636613933343862643863636165653635343532386538313561 Jan 27 04:47:30.341000 audit: BPF prog-id=170 op=UNLOAD Jan 27 04:47:30.341000 audit[3610]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3422 pid=3610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:30.341000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631616636613933343862643863636165653635343532386538313561 Jan 27 04:47:30.341000 audit: BPF prog-id=169 op=UNLOAD Jan 27 04:47:30.341000 audit[3610]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3422 pid=3610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:30.341000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631616636613933343862643863636165653635343532386538313561 Jan 27 04:47:30.341000 audit: BPF prog-id=171 op=LOAD Jan 27 04:47:30.341000 audit[3610]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3422 pid=3610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:30.341000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631616636613933343862643863636165653635343532386538313561 Jan 27 04:47:30.361422 containerd[1666]: time="2026-01-27T04:47:30.361370305Z" level=info msg="StartContainer for 
\"f1af6a9348bd8ccaee654528e815aa84984b9c84592aaee923c7ca124de96e5c\" returns successfully" Jan 27 04:47:30.375408 systemd[1]: cri-containerd-f1af6a9348bd8ccaee654528e815aa84984b9c84592aaee923c7ca124de96e5c.scope: Deactivated successfully. Jan 27 04:47:30.378774 containerd[1666]: time="2026-01-27T04:47:30.378568633Z" level=info msg="received container exit event container_id:\"f1af6a9348bd8ccaee654528e815aa84984b9c84592aaee923c7ca124de96e5c\" id:\"f1af6a9348bd8ccaee654528e815aa84984b9c84592aaee923c7ca124de96e5c\" pid:3624 exited_at:{seconds:1769489250 nanos:378199871}" Jan 27 04:47:30.379000 audit: BPF prog-id=171 op=UNLOAD Jan 27 04:47:30.400523 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f1af6a9348bd8ccaee654528e815aa84984b9c84592aaee923c7ca124de96e5c-rootfs.mount: Deactivated successfully. Jan 27 04:47:30.912393 kubelet[2905]: I0127 04:47:30.912364 2905 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 04:47:30.913512 containerd[1666]: time="2026-01-27T04:47:30.913478962Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 27 04:47:30.928919 kubelet[2905]: I0127 04:47:30.928795 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5cc9894cdd-9qx49" podStartSLOduration=3.321327098 podStartE2EDuration="5.92877748s" podCreationTimestamp="2026-01-27 04:47:25 +0000 UTC" firstStartedPulling="2026-01-27 04:47:26.28716416 +0000 UTC m=+22.544775613" lastFinishedPulling="2026-01-27 04:47:28.894614582 +0000 UTC m=+25.152225995" observedRunningTime="2026-01-27 04:47:29.921960903 +0000 UTC m=+26.179572356" watchObservedRunningTime="2026-01-27 04:47:30.92877748 +0000 UTC m=+27.186388933" Jan 27 04:47:31.825808 kubelet[2905]: E0127 04:47:31.825377 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:47:33.045294 containerd[1666]: time="2026-01-27T04:47:33.045250437Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:47:33.047626 containerd[1666]: time="2026-01-27T04:47:33.047577609Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248" Jan 27 04:47:33.048855 containerd[1666]: time="2026-01-27T04:47:33.048829015Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:47:33.051401 containerd[1666]: time="2026-01-27T04:47:33.051353948Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:47:33.052182 containerd[1666]: time="2026-01-27T04:47:33.052159592Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 2.13864403s" Jan 27 04:47:33.052347 containerd[1666]: 
time="2026-01-27T04:47:33.052256113Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Jan 27 04:47:33.054669 containerd[1666]: time="2026-01-27T04:47:33.054260803Z" level=info msg="CreateContainer within sandbox \"bc9194740cba4ec6d82a3ddd1b330c22515536dfa2343fcecc566d8a0da8a3a1\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 27 04:47:33.066470 containerd[1666]: time="2026-01-27T04:47:33.066411105Z" level=info msg="Container 288af1dcd7f10a7713a185d291a0319be51de93f0d77920a9ed6f12faf448ff0: CDI devices from CRI Config.CDIDevices: []" Jan 27 04:47:33.078529 containerd[1666]: time="2026-01-27T04:47:33.078475967Z" level=info msg="CreateContainer within sandbox \"bc9194740cba4ec6d82a3ddd1b330c22515536dfa2343fcecc566d8a0da8a3a1\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"288af1dcd7f10a7713a185d291a0319be51de93f0d77920a9ed6f12faf448ff0\"" Jan 27 04:47:33.079000 containerd[1666]: time="2026-01-27T04:47:33.078973169Z" level=info msg="StartContainer for \"288af1dcd7f10a7713a185d291a0319be51de93f0d77920a9ed6f12faf448ff0\"" Jan 27 04:47:33.080707 containerd[1666]: time="2026-01-27T04:47:33.080670098Z" level=info msg="connecting to shim 288af1dcd7f10a7713a185d291a0319be51de93f0d77920a9ed6f12faf448ff0" address="unix:///run/containerd/s/18f7b974747ceab1977f7f6c0e7380928d9b901831411d9f96d20341dcf7317a" protocol=ttrpc version=3 Jan 27 04:47:33.103496 systemd[1]: Started cri-containerd-288af1dcd7f10a7713a185d291a0319be51de93f0d77920a9ed6f12faf448ff0.scope - libcontainer container 288af1dcd7f10a7713a185d291a0319be51de93f0d77920a9ed6f12faf448ff0. Jan 27 04:47:33.166000 audit: BPF prog-id=172 op=LOAD Jan 27 04:47:33.166000 audit[3672]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3422 pid=3672 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:33.166000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3238386166316463643766313061373731336131383564323931613033 Jan 27 04:47:33.166000 audit: BPF prog-id=173 op=LOAD Jan 27 04:47:33.166000 audit[3672]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3422 pid=3672 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:33.166000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3238386166316463643766313061373731336131383564323931613033 Jan 27 04:47:33.166000 audit: BPF prog-id=173 op=UNLOAD Jan 27 04:47:33.166000 audit[3672]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3422 pid=3672 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:33.166000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3238386166316463643766313061373731336131383564323931613033 Jan 27 04:47:33.166000 audit: BPF prog-id=172 op=UNLOAD Jan 27 04:47:33.166000 audit[3672]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3422 pid=3672 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:33.166000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3238386166316463643766313061373731336131383564323931613033 Jan 27 04:47:33.166000 audit: BPF prog-id=174 op=LOAD Jan 27 04:47:33.166000 audit[3672]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3422 pid=3672 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:33.166000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3238386166316463643766313061373731336131383564323931613033 Jan 27 04:47:33.198470 containerd[1666]: time="2026-01-27T04:47:33.198393978Z" level=info msg="StartContainer for \"288af1dcd7f10a7713a185d291a0319be51de93f0d77920a9ed6f12faf448ff0\" returns successfully" Jan 27 04:47:33.605270 containerd[1666]: time="2026-01-27T04:47:33.605218414Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 27 04:47:33.607082 systemd[1]: cri-containerd-288af1dcd7f10a7713a185d291a0319be51de93f0d77920a9ed6f12faf448ff0.scope: Deactivated successfully. Jan 27 04:47:33.607566 systemd[1]: cri-containerd-288af1dcd7f10a7713a185d291a0319be51de93f0d77920a9ed6f12faf448ff0.scope: Consumed 469ms CPU time, 192.1M memory peak, 165.9M written to disk. Jan 27 04:47:33.608688 containerd[1666]: time="2026-01-27T04:47:33.608648511Z" level=info msg="received container exit event container_id:\"288af1dcd7f10a7713a185d291a0319be51de93f0d77920a9ed6f12faf448ff0\" id:\"288af1dcd7f10a7713a185d291a0319be51de93f0d77920a9ed6f12faf448ff0\" pid:3685 exited_at:{seconds:1769489253 nanos:608431630}" Jan 27 04:47:33.611000 audit: BPF prog-id=174 op=UNLOAD Jan 27 04:47:33.629045 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-288af1dcd7f10a7713a185d291a0319be51de93f0d77920a9ed6f12faf448ff0-rootfs.mount: Deactivated successfully. 
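Note: the audit PROCTITLE fields in the records above are hex-encoded, NUL-separated argv strings. The ones here decode to runc invocations of the form runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/<container-id…>, with the container id apparently cut off at audit's 128-byte proctitle cap. A small decoder, assuming only the Go standard library:

// Decode an audit PROCTITLE value (hex-encoded, NUL-separated argv) such as
// the runc command lines recorded above.
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

func main() {
	// Leading bytes of one of the PROCTITLE values above (shortened here);
	// pass a full value to recover the whole runc command line.
	const proctitle = "72756E63002D2D726F6F74002F72756E2F636F6E7461696E657264"
	raw, err := hex.DecodeString(proctitle)
	if err != nil {
		panic(err)
	}
	argv := strings.Split(string(raw), "\x00")
	fmt.Println(strings.Join(argv, " ")) // prints: runc --root /run/containerd
}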
Jan 27 04:47:33.689932 kubelet[2905]: I0127 04:47:33.689895 2905 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 27 04:47:33.721507 kubelet[2905]: W0127 04:47:33.720758 2905 reflector.go:569] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4592-0-0-n-c2731c5fad" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4592-0-0-n-c2731c5fad' and this object Jan 27 04:47:33.721507 kubelet[2905]: E0127 04:47:33.720796 2905 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4592-0-0-n-c2731c5fad\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4592-0-0-n-c2731c5fad' and this object" logger="UnhandledError" Jan 27 04:47:33.733938 systemd[1]: Created slice kubepods-burstable-pod22752498_52cd_4c33_8c15_87b19e76c80e.slice - libcontainer container kubepods-burstable-pod22752498_52cd_4c33_8c15_87b19e76c80e.slice. Jan 27 04:47:33.745822 systemd[1]: Created slice kubepods-besteffort-pode725b346_d7db_48ca_8580_5074b068cd87.slice - libcontainer container kubepods-besteffort-pode725b346_d7db_48ca_8580_5074b068cd87.slice. Jan 27 04:47:33.753619 systemd[1]: Created slice kubepods-besteffort-pod5edc651f_3273_46b9_a554_3e38c11ea910.slice - libcontainer container kubepods-besteffort-pod5edc651f_3273_46b9_a554_3e38c11ea910.slice. Jan 27 04:47:33.761270 systemd[1]: Created slice kubepods-burstable-pod733d0109_2680_4e52_a103_0ca18ce93ba4.slice - libcontainer container kubepods-burstable-pod733d0109_2680_4e52_a103_0ca18ce93ba4.slice. Jan 27 04:47:33.766868 systemd[1]: Created slice kubepods-besteffort-podb46e8e69_6e20_4188_9b8d_4e06490f6e72.slice - libcontainer container kubepods-besteffort-podb46e8e69_6e20_4188_9b8d_4e06490f6e72.slice. Jan 27 04:47:33.771745 systemd[1]: Created slice kubepods-besteffort-pod9d8212ef_8a8e_4c6c_9b61_a5e5cc87fb40.slice - libcontainer container kubepods-besteffort-pod9d8212ef_8a8e_4c6c_9b61_a5e5cc87fb40.slice. Jan 27 04:47:33.777659 systemd[1]: Created slice kubepods-besteffort-pod1f1398e6_fd52_4fce_b3fa_2a2bc91ba72b.slice - libcontainer container kubepods-besteffort-pod1f1398e6_fd52_4fce_b3fa_2a2bc91ba72b.slice. 
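Note: the slice names in the systemd entries above follow the kubelet's systemd cgroup naming, where the pod UID has its dashes replaced by underscores and is wrapped as kubepods-<qos>-pod<uid>.slice (compare pod UID 22752498-52cd-4c33-8c15-87b19e76c80e with kubepods-burstable-pod22752498_52cd_4c33_8c15_87b19e76c80e.slice). A sketch of that mapping; the helper name is ours, the pattern is taken from the log entries themselves.

// Illustrative mapping from pod UID + QoS class to the systemd slice names
// seen above (kubepods-burstable-pod..., kubepods-besteffort-pod...).
package main

import (
	"fmt"
	"strings"
)

func podSliceName(qosClass, podUID string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice",
		strings.ToLower(qosClass), strings.ReplaceAll(podUID, "-", "_"))
}

func main() {
	fmt.Println(podSliceName("burstable", "22752498-52cd-4c33-8c15-87b19e76c80e"))
	fmt.Println(podSliceName("besteffort", "e725b346-d7db-48ca-8580-5074b068cd87"))
}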
Jan 27 04:47:33.809604 kubelet[2905]: I0127 04:47:33.809159 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5ffk\" (UniqueName: \"kubernetes.io/projected/22752498-52cd-4c33-8c15-87b19e76c80e-kube-api-access-q5ffk\") pod \"coredns-668d6bf9bc-6svnw\" (UID: \"22752498-52cd-4c33-8c15-87b19e76c80e\") " pod="kube-system/coredns-668d6bf9bc-6svnw" Jan 27 04:47:33.809604 kubelet[2905]: I0127 04:47:33.809217 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b-goldmane-ca-bundle\") pod \"goldmane-666569f655-trbp2\" (UID: \"1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b\") " pod="calico-system/goldmane-666569f655-trbp2" Jan 27 04:47:33.809604 kubelet[2905]: I0127 04:47:33.809236 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wb7k\" (UniqueName: \"kubernetes.io/projected/b46e8e69-6e20-4188-9b8d-4e06490f6e72-kube-api-access-8wb7k\") pod \"calico-apiserver-b5b8d765d-58pbp\" (UID: \"b46e8e69-6e20-4188-9b8d-4e06490f6e72\") " pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" Jan 27 04:47:33.809604 kubelet[2905]: I0127 04:47:33.809252 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbn74\" (UniqueName: \"kubernetes.io/projected/1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b-kube-api-access-zbn74\") pod \"goldmane-666569f655-trbp2\" (UID: \"1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b\") " pod="calico-system/goldmane-666569f655-trbp2" Jan 27 04:47:33.809604 kubelet[2905]: I0127 04:47:33.809272 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmpx8\" (UniqueName: \"kubernetes.io/projected/5edc651f-3273-46b9-a554-3e38c11ea910-kube-api-access-nmpx8\") pod \"calico-kube-controllers-799789d486-wpdhm\" (UID: \"5edc651f-3273-46b9-a554-3e38c11ea910\") " pod="calico-system/calico-kube-controllers-799789d486-wpdhm" Jan 27 04:47:33.809849 kubelet[2905]: I0127 04:47:33.809289 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v99cv\" (UniqueName: \"kubernetes.io/projected/733d0109-2680-4e52-a103-0ca18ce93ba4-kube-api-access-v99cv\") pod \"coredns-668d6bf9bc-gnjg5\" (UID: \"733d0109-2680-4e52-a103-0ca18ce93ba4\") " pod="kube-system/coredns-668d6bf9bc-gnjg5" Jan 27 04:47:33.809849 kubelet[2905]: I0127 04:47:33.809304 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22752498-52cd-4c33-8c15-87b19e76c80e-config-volume\") pod \"coredns-668d6bf9bc-6svnw\" (UID: \"22752498-52cd-4c33-8c15-87b19e76c80e\") " pod="kube-system/coredns-668d6bf9bc-6svnw" Jan 27 04:47:33.809849 kubelet[2905]: I0127 04:47:33.809320 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e725b346-d7db-48ca-8580-5074b068cd87-calico-apiserver-certs\") pod \"calico-apiserver-b5b8d765d-qg4kw\" (UID: \"e725b346-d7db-48ca-8580-5074b068cd87\") " pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" Jan 27 04:47:33.809849 kubelet[2905]: I0127 04:47:33.809338 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b-config\") pod \"goldmane-666569f655-trbp2\" (UID: \"1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b\") " pod="calico-system/goldmane-666569f655-trbp2" Jan 27 04:47:33.809849 kubelet[2905]: I0127 04:47:33.809356 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40-whisker-backend-key-pair\") pod \"whisker-7bb68ffc95-5z2g7\" (UID: \"9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40\") " pod="calico-system/whisker-7bb68ffc95-5z2g7" Jan 27 04:47:33.809960 kubelet[2905]: I0127 04:47:33.809377 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5edc651f-3273-46b9-a554-3e38c11ea910-tigera-ca-bundle\") pod \"calico-kube-controllers-799789d486-wpdhm\" (UID: \"5edc651f-3273-46b9-a554-3e38c11ea910\") " pod="calico-system/calico-kube-controllers-799789d486-wpdhm" Jan 27 04:47:33.809960 kubelet[2905]: I0127 04:47:33.809391 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b-goldmane-key-pair\") pod \"goldmane-666569f655-trbp2\" (UID: \"1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b\") " pod="calico-system/goldmane-666569f655-trbp2" Jan 27 04:47:33.809960 kubelet[2905]: I0127 04:47:33.809408 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b46e8e69-6e20-4188-9b8d-4e06490f6e72-calico-apiserver-certs\") pod \"calico-apiserver-b5b8d765d-58pbp\" (UID: \"b46e8e69-6e20-4188-9b8d-4e06490f6e72\") " pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" Jan 27 04:47:33.809960 kubelet[2905]: I0127 04:47:33.809424 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/733d0109-2680-4e52-a103-0ca18ce93ba4-config-volume\") pod \"coredns-668d6bf9bc-gnjg5\" (UID: \"733d0109-2680-4e52-a103-0ca18ce93ba4\") " pod="kube-system/coredns-668d6bf9bc-gnjg5" Jan 27 04:47:33.809960 kubelet[2905]: I0127 04:47:33.809444 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40-whisker-ca-bundle\") pod \"whisker-7bb68ffc95-5z2g7\" (UID: \"9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40\") " pod="calico-system/whisker-7bb68ffc95-5z2g7" Jan 27 04:47:33.810065 kubelet[2905]: I0127 04:47:33.809459 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snq4m\" (UniqueName: \"kubernetes.io/projected/9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40-kube-api-access-snq4m\") pod \"whisker-7bb68ffc95-5z2g7\" (UID: \"9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40\") " pod="calico-system/whisker-7bb68ffc95-5z2g7" Jan 27 04:47:33.810065 kubelet[2905]: I0127 04:47:33.809477 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvs85\" (UniqueName: \"kubernetes.io/projected/e725b346-d7db-48ca-8580-5074b068cd87-kube-api-access-tvs85\") pod \"calico-apiserver-b5b8d765d-qg4kw\" (UID: \"e725b346-d7db-48ca-8580-5074b068cd87\") " pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" Jan 
27 04:47:33.831873 systemd[1]: Created slice kubepods-besteffort-pod56e3e6d0_7a6b_4ba3_9081_3231ea811709.slice - libcontainer container kubepods-besteffort-pod56e3e6d0_7a6b_4ba3_9081_3231ea811709.slice. Jan 27 04:47:33.834408 containerd[1666]: time="2026-01-27T04:47:33.834373903Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vk94b,Uid:56e3e6d0-7a6b-4ba3-9081-3231ea811709,Namespace:calico-system,Attempt:0,}" Jan 27 04:47:33.907366 containerd[1666]: time="2026-01-27T04:47:33.907232555Z" level=error msg="Failed to destroy network for sandbox \"f22c582b7045cdc2cf8bcce6587f531ede7c06e28fa50e2ca79790a2ad22a8fb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:33.915127 containerd[1666]: time="2026-01-27T04:47:33.913501107Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vk94b,Uid:56e3e6d0-7a6b-4ba3-9081-3231ea811709,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f22c582b7045cdc2cf8bcce6587f531ede7c06e28fa50e2ca79790a2ad22a8fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:33.915364 kubelet[2905]: E0127 04:47:33.913756 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f22c582b7045cdc2cf8bcce6587f531ede7c06e28fa50e2ca79790a2ad22a8fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:33.915364 kubelet[2905]: E0127 04:47:33.913827 2905 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f22c582b7045cdc2cf8bcce6587f531ede7c06e28fa50e2ca79790a2ad22a8fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vk94b" Jan 27 04:47:33.915364 kubelet[2905]: E0127 04:47:33.913847 2905 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f22c582b7045cdc2cf8bcce6587f531ede7c06e28fa50e2ca79790a2ad22a8fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vk94b" Jan 27 04:47:33.918229 kubelet[2905]: E0127 04:47:33.913886 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vk94b_calico-system(56e3e6d0-7a6b-4ba3-9081-3231ea811709)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-vk94b_calico-system(56e3e6d0-7a6b-4ba3-9081-3231ea811709)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f22c582b7045cdc2cf8bcce6587f531ede7c06e28fa50e2ca79790a2ad22a8fb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:47:33.934054 containerd[1666]: time="2026-01-27T04:47:33.933832730Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 27 04:47:34.040992 containerd[1666]: time="2026-01-27T04:47:34.040947677Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6svnw,Uid:22752498-52cd-4c33-8c15-87b19e76c80e,Namespace:kube-system,Attempt:0,}" Jan 27 04:47:34.059806 containerd[1666]: time="2026-01-27T04:47:34.059668212Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-799789d486-wpdhm,Uid:5edc651f-3273-46b9-a554-3e38c11ea910,Namespace:calico-system,Attempt:0,}" Jan 27 04:47:34.066988 containerd[1666]: time="2026-01-27T04:47:34.065701163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gnjg5,Uid:733d0109-2680-4e52-a103-0ca18ce93ba4,Namespace:kube-system,Attempt:0,}" Jan 27 04:47:34.077721 containerd[1666]: time="2026-01-27T04:47:34.077680744Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bb68ffc95-5z2g7,Uid:9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40,Namespace:calico-system,Attempt:0,}" Jan 27 04:47:34.079184 systemd[1]: run-netns-cni\x2deb265148\x2de5b4\x2dea3a\x2db238\x2daefc4645e886.mount: Deactivated successfully. Jan 27 04:47:34.083402 containerd[1666]: time="2026-01-27T04:47:34.083346773Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-trbp2,Uid:1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b,Namespace:calico-system,Attempt:0,}" Jan 27 04:47:34.115798 containerd[1666]: time="2026-01-27T04:47:34.115747299Z" level=error msg="Failed to destroy network for sandbox \"5c44ebbcf469fc3f5cce4a42e6ba8ae915d322f7b2a8b62147bd5d8301d9df15\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:34.130639 containerd[1666]: time="2026-01-27T04:47:34.130567094Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6svnw,Uid:22752498-52cd-4c33-8c15-87b19e76c80e,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c44ebbcf469fc3f5cce4a42e6ba8ae915d322f7b2a8b62147bd5d8301d9df15\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:34.130822 kubelet[2905]: E0127 04:47:34.130782 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c44ebbcf469fc3f5cce4a42e6ba8ae915d322f7b2a8b62147bd5d8301d9df15\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:34.130950 kubelet[2905]: E0127 04:47:34.130840 2905 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c44ebbcf469fc3f5cce4a42e6ba8ae915d322f7b2a8b62147bd5d8301d9df15\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6svnw" Jan 27 04:47:34.130950 kubelet[2905]: E0127 04:47:34.130860 2905 
kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c44ebbcf469fc3f5cce4a42e6ba8ae915d322f7b2a8b62147bd5d8301d9df15\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6svnw" Jan 27 04:47:34.130950 kubelet[2905]: E0127 04:47:34.130904 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-6svnw_kube-system(22752498-52cd-4c33-8c15-87b19e76c80e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-6svnw_kube-system(22752498-52cd-4c33-8c15-87b19e76c80e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5c44ebbcf469fc3f5cce4a42e6ba8ae915d322f7b2a8b62147bd5d8301d9df15\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-6svnw" podUID="22752498-52cd-4c33-8c15-87b19e76c80e" Jan 27 04:47:34.151368 containerd[1666]: time="2026-01-27T04:47:34.151234800Z" level=error msg="Failed to destroy network for sandbox \"7319e0f42aa66422ddb612d68c0215ab1d532712ceff7e9283f4543241df1b38\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:34.155078 containerd[1666]: time="2026-01-27T04:47:34.155026139Z" level=error msg="Failed to destroy network for sandbox \"678e90fb014f08158bc6d99d3b974d2d5e5780550ddf6e7af0119b4652acd4a9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:34.165799 containerd[1666]: time="2026-01-27T04:47:34.165695273Z" level=error msg="Failed to destroy network for sandbox \"d1a61069b59736886c869973f1bafe8fdb2725c5ea4b437cdbbe49a32ca51532\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:34.171969 containerd[1666]: time="2026-01-27T04:47:34.171920585Z" level=error msg="Failed to destroy network for sandbox \"c02bac3d1d4a640259443a99017040931557b3c7e817220f31e8fde085066fb1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:34.172615 containerd[1666]: time="2026-01-27T04:47:34.172575708Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-trbp2,Uid:1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"678e90fb014f08158bc6d99d3b974d2d5e5780550ddf6e7af0119b4652acd4a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:34.172833 kubelet[2905]: E0127 04:47:34.172789 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"678e90fb014f08158bc6d99d3b974d2d5e5780550ddf6e7af0119b4652acd4a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:34.172885 kubelet[2905]: E0127 04:47:34.172858 2905 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"678e90fb014f08158bc6d99d3b974d2d5e5780550ddf6e7af0119b4652acd4a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-trbp2" Jan 27 04:47:34.172885 kubelet[2905]: E0127 04:47:34.172877 2905 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"678e90fb014f08158bc6d99d3b974d2d5e5780550ddf6e7af0119b4652acd4a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-trbp2" Jan 27 04:47:34.172956 kubelet[2905]: E0127 04:47:34.172931 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-trbp2_calico-system(1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-trbp2_calico-system(1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"678e90fb014f08158bc6d99d3b974d2d5e5780550ddf6e7af0119b4652acd4a9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:47:34.180356 containerd[1666]: time="2026-01-27T04:47:34.180260468Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-799789d486-wpdhm,Uid:5edc651f-3273-46b9-a554-3e38c11ea910,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7319e0f42aa66422ddb612d68c0215ab1d532712ceff7e9283f4543241df1b38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:34.180692 kubelet[2905]: E0127 04:47:34.180543 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7319e0f42aa66422ddb612d68c0215ab1d532712ceff7e9283f4543241df1b38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:34.180692 kubelet[2905]: E0127 04:47:34.180652 2905 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7319e0f42aa66422ddb612d68c0215ab1d532712ceff7e9283f4543241df1b38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-799789d486-wpdhm" Jan 27 04:47:34.180692 kubelet[2905]: E0127 04:47:34.180671 2905 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7319e0f42aa66422ddb612d68c0215ab1d532712ceff7e9283f4543241df1b38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" Jan 27 04:47:34.180794 kubelet[2905]: E0127 04:47:34.180709 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-799789d486-wpdhm_calico-system(5edc651f-3273-46b9-a554-3e38c11ea910)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-799789d486-wpdhm_calico-system(5edc651f-3273-46b9-a554-3e38c11ea910)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7319e0f42aa66422ddb612d68c0215ab1d532712ceff7e9283f4543241df1b38\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:47:34.187424 containerd[1666]: time="2026-01-27T04:47:34.187373384Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gnjg5,Uid:733d0109-2680-4e52-a103-0ca18ce93ba4,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c02bac3d1d4a640259443a99017040931557b3c7e817220f31e8fde085066fb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:34.188215 kubelet[2905]: E0127 04:47:34.187641 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c02bac3d1d4a640259443a99017040931557b3c7e817220f31e8fde085066fb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:34.188215 kubelet[2905]: E0127 04:47:34.187681 2905 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c02bac3d1d4a640259443a99017040931557b3c7e817220f31e8fde085066fb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-gnjg5" Jan 27 04:47:34.188215 kubelet[2905]: E0127 04:47:34.187696 2905 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c02bac3d1d4a640259443a99017040931557b3c7e817220f31e8fde085066fb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-gnjg5" Jan 27 04:47:34.188337 kubelet[2905]: E0127 04:47:34.187735 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"coredns-668d6bf9bc-gnjg5_kube-system(733d0109-2680-4e52-a103-0ca18ce93ba4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-gnjg5_kube-system(733d0109-2680-4e52-a103-0ca18ce93ba4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c02bac3d1d4a640259443a99017040931557b3c7e817220f31e8fde085066fb1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-gnjg5" podUID="733d0109-2680-4e52-a103-0ca18ce93ba4" Jan 27 04:47:34.189901 containerd[1666]: time="2026-01-27T04:47:34.189840837Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bb68ffc95-5z2g7,Uid:9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1a61069b59736886c869973f1bafe8fdb2725c5ea4b437cdbbe49a32ca51532\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:34.190025 kubelet[2905]: E0127 04:47:34.189992 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1a61069b59736886c869973f1bafe8fdb2725c5ea4b437cdbbe49a32ca51532\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:34.190065 kubelet[2905]: E0127 04:47:34.190032 2905 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1a61069b59736886c869973f1bafe8fdb2725c5ea4b437cdbbe49a32ca51532\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7bb68ffc95-5z2g7" Jan 27 04:47:34.190065 kubelet[2905]: E0127 04:47:34.190050 2905 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1a61069b59736886c869973f1bafe8fdb2725c5ea4b437cdbbe49a32ca51532\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7bb68ffc95-5z2g7" Jan 27 04:47:34.190144 kubelet[2905]: E0127 04:47:34.190082 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7bb68ffc95-5z2g7_calico-system(9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7bb68ffc95-5z2g7_calico-system(9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d1a61069b59736886c869973f1bafe8fdb2725c5ea4b437cdbbe49a32ca51532\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7bb68ffc95-5z2g7" podUID="9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40" Jan 27 04:47:34.927406 kubelet[2905]: E0127 04:47:34.927225 2905 
projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 27 04:47:34.927406 kubelet[2905]: E0127 04:47:34.927292 2905 projected.go:194] Error preparing data for projected volume kube-api-access-tvs85 for pod calico-apiserver/calico-apiserver-b5b8d765d-qg4kw: failed to sync configmap cache: timed out waiting for the condition Jan 27 04:47:34.929200 kubelet[2905]: E0127 04:47:34.927399 2905 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e725b346-d7db-48ca-8580-5074b068cd87-kube-api-access-tvs85 podName:e725b346-d7db-48ca-8580-5074b068cd87 nodeName:}" failed. No retries permitted until 2026-01-27 04:47:35.427362119 +0000 UTC m=+31.684973612 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tvs85" (UniqueName: "kubernetes.io/projected/e725b346-d7db-48ca-8580-5074b068cd87-kube-api-access-tvs85") pod "calico-apiserver-b5b8d765d-qg4kw" (UID: "e725b346-d7db-48ca-8580-5074b068cd87") : failed to sync configmap cache: timed out waiting for the condition Jan 27 04:47:34.932655 kubelet[2905]: E0127 04:47:34.932629 2905 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 27 04:47:34.932655 kubelet[2905]: E0127 04:47:34.932657 2905 projected.go:194] Error preparing data for projected volume kube-api-access-8wb7k for pod calico-apiserver/calico-apiserver-b5b8d765d-58pbp: failed to sync configmap cache: timed out waiting for the condition Jan 27 04:47:34.932763 kubelet[2905]: E0127 04:47:34.932702 2905 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b46e8e69-6e20-4188-9b8d-4e06490f6e72-kube-api-access-8wb7k podName:b46e8e69-6e20-4188-9b8d-4e06490f6e72 nodeName:}" failed. No retries permitted until 2026-01-27 04:47:35.432687986 +0000 UTC m=+31.690299439 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8wb7k" (UniqueName: "kubernetes.io/projected/b46e8e69-6e20-4188-9b8d-4e06490f6e72-kube-api-access-8wb7k") pod "calico-apiserver-b5b8d765d-58pbp" (UID: "b46e8e69-6e20-4188-9b8d-4e06490f6e72") : failed to sync configmap cache: timed out waiting for the condition Jan 27 04:47:35.067862 systemd[1]: run-netns-cni\x2dd433bdb2\x2d0494\x2da5ef\x2d0c3f\x2d85167995a53a.mount: Deactivated successfully. Jan 27 04:47:35.067958 systemd[1]: run-netns-cni\x2deb30d3ee\x2d6769\x2dc9c5\x2d048d\x2dd9986e5a16cf.mount: Deactivated successfully. Jan 27 04:47:35.068009 systemd[1]: run-netns-cni\x2d369ee089\x2dc20e\x2d5e14\x2d724a\x2dcb05d96b5a40.mount: Deactivated successfully. Jan 27 04:47:35.068055 systemd[1]: run-netns-cni\x2dcba46b7a\x2db7c9\x2d9b4a\x2d3dcf\x2da9bfa19f5f19.mount: Deactivated successfully. Jan 27 04:47:35.068123 systemd[1]: run-netns-cni\x2db8027b6b\x2df58d\x2d92b5\x2d51ad\x2dff048aa87b69.mount: Deactivated successfully. 
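The run-netns-cni\x2d… mount units above are systemd unit names, in which "/" in the underlying path becomes "-" and literal dashes are escaped as \x2d. A rough sketch that maps such a .mount unit back to its path; mount_unit_to_path is an illustrative helper that only handles the \x2d escapes appearing in these entries, not every systemd-escape rule:

    # Map a .mount unit name from the log back to its path.  '-' separates path
    # components, while literal dashes (e.g. inside the CNI netns ids) are
    # escaped as \x2d; the replacement order below relies on that.
    def mount_unit_to_path(unit: str) -> str:
        name = unit.removesuffix(".mount")
        return "/" + name.replace("-", "/").replace(r"\x2d", "-")

    print(mount_unit_to_path(r"run-netns-cni\x2db8027b6b\x2df58d\x2d92b5\x2d51ad\x2dff048aa87b69.mount"))
    # -> /run/netns/cni-b8027b6b-f58d-92b5-51ad-ff048aa87b69
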
Jan 27 04:47:35.551787 containerd[1666]: time="2026-01-27T04:47:35.551746264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b5b8d765d-qg4kw,Uid:e725b346-d7db-48ca-8580-5074b068cd87,Namespace:calico-apiserver,Attempt:0,}" Jan 27 04:47:35.570930 containerd[1666]: time="2026-01-27T04:47:35.570752401Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b5b8d765d-58pbp,Uid:b46e8e69-6e20-4188-9b8d-4e06490f6e72,Namespace:calico-apiserver,Attempt:0,}" Jan 27 04:47:35.618291 containerd[1666]: time="2026-01-27T04:47:35.618218444Z" level=error msg="Failed to destroy network for sandbox \"9ebfe93cdf85b0bcc9ab4b41689e6394622a4d76d75bb79b0c10e3d8c39f4aee\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:35.624752 containerd[1666]: time="2026-01-27T04:47:35.624671676Z" level=error msg="Failed to destroy network for sandbox \"f88ff977f848b36abf759c7510ec7c2cbe06ae3e2a5af2b84dbb48e253a98fc0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:35.625711 containerd[1666]: time="2026-01-27T04:47:35.625649921Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b5b8d765d-qg4kw,Uid:e725b346-d7db-48ca-8580-5074b068cd87,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ebfe93cdf85b0bcc9ab4b41689e6394622a4d76d75bb79b0c10e3d8c39f4aee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:35.625952 kubelet[2905]: E0127 04:47:35.625901 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ebfe93cdf85b0bcc9ab4b41689e6394622a4d76d75bb79b0c10e3d8c39f4aee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:35.626047 kubelet[2905]: E0127 04:47:35.625964 2905 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ebfe93cdf85b0bcc9ab4b41689e6394622a4d76d75bb79b0c10e3d8c39f4aee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" Jan 27 04:47:35.626047 kubelet[2905]: E0127 04:47:35.625989 2905 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ebfe93cdf85b0bcc9ab4b41689e6394622a4d76d75bb79b0c10e3d8c39f4aee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" Jan 27 04:47:35.626133 kubelet[2905]: E0127 04:47:35.626034 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-b5b8d765d-qg4kw_calico-apiserver(e725b346-d7db-48ca-8580-5074b068cd87)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b5b8d765d-qg4kw_calico-apiserver(e725b346-d7db-48ca-8580-5074b068cd87)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9ebfe93cdf85b0bcc9ab4b41689e6394622a4d76d75bb79b0c10e3d8c39f4aee\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:47:35.630519 containerd[1666]: time="2026-01-27T04:47:35.630463386Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b5b8d765d-58pbp,Uid:b46e8e69-6e20-4188-9b8d-4e06490f6e72,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f88ff977f848b36abf759c7510ec7c2cbe06ae3e2a5af2b84dbb48e253a98fc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:35.630688 kubelet[2905]: E0127 04:47:35.630642 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f88ff977f848b36abf759c7510ec7c2cbe06ae3e2a5af2b84dbb48e253a98fc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 27 04:47:35.630738 kubelet[2905]: E0127 04:47:35.630705 2905 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f88ff977f848b36abf759c7510ec7c2cbe06ae3e2a5af2b84dbb48e253a98fc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" Jan 27 04:47:35.630768 kubelet[2905]: E0127 04:47:35.630724 2905 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f88ff977f848b36abf759c7510ec7c2cbe06ae3e2a5af2b84dbb48e253a98fc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" Jan 27 04:47:35.630811 kubelet[2905]: E0127 04:47:35.630787 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b5b8d765d-58pbp_calico-apiserver(b46e8e69-6e20-4188-9b8d-4e06490f6e72)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b5b8d765d-58pbp_calico-apiserver(b46e8e69-6e20-4188-9b8d-4e06490f6e72)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f88ff977f848b36abf759c7510ec7c2cbe06ae3e2a5af2b84dbb48e253a98fc0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" 
podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:47:36.068048 systemd[1]: run-netns-cni\x2d4c0eb030\x2ddea8\x2d7363\x2d3b7b\x2d2e3e482d2ed8.mount: Deactivated successfully. Jan 27 04:47:36.068154 systemd[1]: run-netns-cni\x2d1b5c8016\x2dcc8a\x2da485\x2d7031\x2d18ce514ea9fc.mount: Deactivated successfully. Jan 27 04:47:39.048460 kubelet[2905]: I0127 04:47:39.048400 2905 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 04:47:39.079000 audit[3992]: NETFILTER_CFG table=filter:119 family=2 entries=21 op=nft_register_rule pid=3992 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:39.081347 kernel: kauditd_printk_skb: 44 callbacks suppressed Jan 27 04:47:39.081429 kernel: audit: type=1325 audit(1769489259.079:586): table=filter:119 family=2 entries=21 op=nft_register_rule pid=3992 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:39.079000 audit[3992]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffff661b080 a2=0 a3=1 items=0 ppid=3073 pid=3992 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:39.086139 kernel: audit: type=1300 audit(1769489259.079:586): arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffff661b080 a2=0 a3=1 items=0 ppid=3073 pid=3992 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:39.086220 kernel: audit: type=1327 audit(1769489259.079:586): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:39.079000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:39.085000 audit[3992]: NETFILTER_CFG table=nat:120 family=2 entries=19 op=nft_register_chain pid=3992 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:39.089527 kernel: audit: type=1325 audit(1769489259.085:587): table=nat:120 family=2 entries=19 op=nft_register_chain pid=3992 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:39.089597 kernel: audit: type=1300 audit(1769489259.085:587): arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=fffff661b080 a2=0 a3=1 items=0 ppid=3073 pid=3992 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:39.085000 audit[3992]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=fffff661b080 a2=0 a3=1 items=0 ppid=3073 pid=3992 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:39.085000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:39.094469 kernel: audit: type=1327 audit(1769489259.085:587): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:41.208530 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount444098548.mount: Deactivated 
successfully. Jan 27 04:47:41.244180 containerd[1666]: time="2026-01-27T04:47:41.243752023Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:47:41.245014 containerd[1666]: time="2026-01-27T04:47:41.244933829Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Jan 27 04:47:41.246674 containerd[1666]: time="2026-01-27T04:47:41.246621078Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:47:41.249044 containerd[1666]: time="2026-01-27T04:47:41.249002770Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 27 04:47:41.249699 containerd[1666]: time="2026-01-27T04:47:41.249664173Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 7.315787842s" Jan 27 04:47:41.249699 containerd[1666]: time="2026-01-27T04:47:41.249694013Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Jan 27 04:47:41.256186 containerd[1666]: time="2026-01-27T04:47:41.256150326Z" level=info msg="CreateContainer within sandbox \"bc9194740cba4ec6d82a3ddd1b330c22515536dfa2343fcecc566d8a0da8a3a1\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 27 04:47:41.270338 containerd[1666]: time="2026-01-27T04:47:41.270191078Z" level=info msg="Container 3e10a73c674fde052d93a3780b731b936152976ebfe4bd0a2d7a4ae6329556bc: CDI devices from CRI Config.CDIDevices: []" Jan 27 04:47:41.278949 containerd[1666]: time="2026-01-27T04:47:41.278881562Z" level=info msg="CreateContainer within sandbox \"bc9194740cba4ec6d82a3ddd1b330c22515536dfa2343fcecc566d8a0da8a3a1\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"3e10a73c674fde052d93a3780b731b936152976ebfe4bd0a2d7a4ae6329556bc\"" Jan 27 04:47:41.280119 containerd[1666]: time="2026-01-27T04:47:41.279563566Z" level=info msg="StartContainer for \"3e10a73c674fde052d93a3780b731b936152976ebfe4bd0a2d7a4ae6329556bc\"" Jan 27 04:47:41.281171 containerd[1666]: time="2026-01-27T04:47:41.281136934Z" level=info msg="connecting to shim 3e10a73c674fde052d93a3780b731b936152976ebfe4bd0a2d7a4ae6329556bc" address="unix:///run/containerd/s/18f7b974747ceab1977f7f6c0e7380928d9b901831411d9f96d20341dcf7317a" protocol=ttrpc version=3 Jan 27 04:47:41.299327 systemd[1]: Started cri-containerd-3e10a73c674fde052d93a3780b731b936152976ebfe4bd0a2d7a4ae6329556bc.scope - libcontainer container 3e10a73c674fde052d93a3780b731b936152976ebfe4bd0a2d7a4ae6329556bc. 
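For scale, the pull figures above are self-consistent: roughly 150930912 bytes read over the reported 7.315787842s (the PullImage request was logged at 04:47:33.933 and the Pulled message at 04:47:41.249), i.e. on the order of 20 MB/s from the registry. A quick check using only the numbers printed in these entries:

    # Back-of-the-envelope pull rate from the figures logged above.
    bytes_read = 150_930_912        # "active requests=0, bytes read=150930912"
    duration_s = 7.315787842        # "... in 7.315787842s" from the Pulled image message

    print(f"{bytes_read / duration_s / 1e6:.1f} MB/s")   # -> 20.6 MB/s
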
Jan 27 04:47:41.352000 audit: BPF prog-id=175 op=LOAD Jan 27 04:47:41.352000 audit[4001]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3422 pid=4001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:41.357616 kernel: audit: type=1334 audit(1769489261.352:588): prog-id=175 op=LOAD Jan 27 04:47:41.357666 kernel: audit: type=1300 audit(1769489261.352:588): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3422 pid=4001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:41.352000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365313061373363363734666465303532643933613337383062373331 Jan 27 04:47:41.361036 kernel: audit: type=1327 audit(1769489261.352:588): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365313061373363363734666465303532643933613337383062373331 Jan 27 04:47:41.361159 kernel: audit: type=1334 audit(1769489261.353:589): prog-id=176 op=LOAD Jan 27 04:47:41.353000 audit: BPF prog-id=176 op=LOAD Jan 27 04:47:41.353000 audit[4001]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3422 pid=4001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:41.353000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365313061373363363734666465303532643933613337383062373331 Jan 27 04:47:41.353000 audit: BPF prog-id=176 op=UNLOAD Jan 27 04:47:41.353000 audit[4001]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3422 pid=4001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:41.353000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365313061373363363734666465303532643933613337383062373331 Jan 27 04:47:41.353000 audit: BPF prog-id=175 op=UNLOAD Jan 27 04:47:41.353000 audit[4001]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3422 pid=4001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:41.353000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365313061373363363734666465303532643933613337383062373331 Jan 27 04:47:41.353000 audit: BPF prog-id=177 op=LOAD Jan 27 04:47:41.353000 audit[4001]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3422 pid=4001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:41.353000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365313061373363363734666465303532643933613337383062373331 Jan 27 04:47:41.380677 containerd[1666]: time="2026-01-27T04:47:41.380635802Z" level=info msg="StartContainer for \"3e10a73c674fde052d93a3780b731b936152976ebfe4bd0a2d7a4ae6329556bc\" returns successfully" Jan 27 04:47:41.522476 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 27 04:47:41.522674 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 27 04:47:41.660370 kubelet[2905]: I0127 04:47:41.660322 2905 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snq4m\" (UniqueName: \"kubernetes.io/projected/9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40-kube-api-access-snq4m\") pod \"9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40\" (UID: \"9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40\") " Jan 27 04:47:41.660370 kubelet[2905]: I0127 04:47:41.660373 2905 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40-whisker-backend-key-pair\") pod \"9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40\" (UID: \"9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40\") " Jan 27 04:47:41.660738 kubelet[2905]: I0127 04:47:41.660405 2905 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40-whisker-ca-bundle\") pod \"9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40\" (UID: \"9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40\") " Jan 27 04:47:41.661132 kubelet[2905]: I0127 04:47:41.660771 2905 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40" (UID: "9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 27 04:47:41.664023 kubelet[2905]: I0127 04:47:41.663977 2905 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40-kube-api-access-snq4m" (OuterVolumeSpecName: "kube-api-access-snq4m") pod "9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40" (UID: "9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40"). InnerVolumeSpecName "kube-api-access-snq4m". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 27 04:47:41.667202 kubelet[2905]: I0127 04:47:41.667154 2905 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40" (UID: "9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 27 04:47:41.761334 kubelet[2905]: I0127 04:47:41.761280 2905 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-snq4m\" (UniqueName: \"kubernetes.io/projected/9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40-kube-api-access-snq4m\") on node \"ci-4592-0-0-n-c2731c5fad\" DevicePath \"\"" Jan 27 04:47:41.761334 kubelet[2905]: I0127 04:47:41.761321 2905 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40-whisker-backend-key-pair\") on node \"ci-4592-0-0-n-c2731c5fad\" DevicePath \"\"" Jan 27 04:47:41.761334 kubelet[2905]: I0127 04:47:41.761333 2905 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40-whisker-ca-bundle\") on node \"ci-4592-0-0-n-c2731c5fad\" DevicePath \"\"" Jan 27 04:47:41.832708 systemd[1]: Removed slice kubepods-besteffort-pod9d8212ef_8a8e_4c6c_9b61_a5e5cc87fb40.slice - libcontainer container kubepods-besteffort-pod9d8212ef_8a8e_4c6c_9b61_a5e5cc87fb40.slice. Jan 27 04:47:41.968882 kubelet[2905]: I0127 04:47:41.968806 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-cb7q9" podStartSLOduration=1.225648827 podStartE2EDuration="15.968791002s" podCreationTimestamp="2026-01-27 04:47:26 +0000 UTC" firstStartedPulling="2026-01-27 04:47:26.507382803 +0000 UTC m=+22.764994216" lastFinishedPulling="2026-01-27 04:47:41.250524978 +0000 UTC m=+37.508136391" observedRunningTime="2026-01-27 04:47:41.968530401 +0000 UTC m=+38.226141894" watchObservedRunningTime="2026-01-27 04:47:41.968791002 +0000 UTC m=+38.226402455" Jan 27 04:47:42.027955 systemd[1]: Created slice kubepods-besteffort-poda9e11ad0_9b2a_4daf_88aa_0d2f20b9fa33.slice - libcontainer container kubepods-besteffort-poda9e11ad0_9b2a_4daf_88aa_0d2f20b9fa33.slice. 
Jan 27 04:47:42.063852 kubelet[2905]: I0127 04:47:42.063764 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvk54\" (UniqueName: \"kubernetes.io/projected/a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33-kube-api-access-xvk54\") pod \"whisker-68746999d7-v2tpn\" (UID: \"a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33\") " pod="calico-system/whisker-68746999d7-v2tpn" Jan 27 04:47:42.063852 kubelet[2905]: I0127 04:47:42.063830 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33-whisker-backend-key-pair\") pod \"whisker-68746999d7-v2tpn\" (UID: \"a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33\") " pod="calico-system/whisker-68746999d7-v2tpn" Jan 27 04:47:42.063852 kubelet[2905]: I0127 04:47:42.063850 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33-whisker-ca-bundle\") pod \"whisker-68746999d7-v2tpn\" (UID: \"a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33\") " pod="calico-system/whisker-68746999d7-v2tpn" Jan 27 04:47:42.211372 systemd[1]: var-lib-kubelet-pods-9d8212ef\x2d8a8e\x2d4c6c\x2d9b61\x2da5e5cc87fb40-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dsnq4m.mount: Deactivated successfully. Jan 27 04:47:42.211475 systemd[1]: var-lib-kubelet-pods-9d8212ef\x2d8a8e\x2d4c6c\x2d9b61\x2da5e5cc87fb40-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 27 04:47:42.332889 containerd[1666]: time="2026-01-27T04:47:42.332625058Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-68746999d7-v2tpn,Uid:a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33,Namespace:calico-system,Attempt:0,}" Jan 27 04:47:42.474950 systemd-networkd[1495]: calie61560b51f1: Link UP Jan 27 04:47:42.475136 systemd-networkd[1495]: calie61560b51f1: Gained carrier Jan 27 04:47:42.489341 containerd[1666]: 2026-01-27 04:47:42.367 [INFO][4067] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 27 04:47:42.489341 containerd[1666]: 2026-01-27 04:47:42.386 [INFO][4067] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--n--c2731c5fad-k8s-whisker--68746999d7--v2tpn-eth0 whisker-68746999d7- calico-system a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33 900 0 2026-01-27 04:47:42 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:68746999d7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4592-0-0-n-c2731c5fad whisker-68746999d7-v2tpn eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calie61560b51f1 [] [] }} ContainerID="9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" Namespace="calico-system" Pod="whisker-68746999d7-v2tpn" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-whisker--68746999d7--v2tpn-" Jan 27 04:47:42.489341 containerd[1666]: 2026-01-27 04:47:42.386 [INFO][4067] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" Namespace="calico-system" Pod="whisker-68746999d7-v2tpn" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-whisker--68746999d7--v2tpn-eth0" Jan 27 04:47:42.489341 containerd[1666]: 2026-01-27 04:47:42.429 [INFO][4082] 
ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" HandleID="k8s-pod-network.9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" Workload="ci--4592--0--0--n--c2731c5fad-k8s-whisker--68746999d7--v2tpn-eth0" Jan 27 04:47:42.489546 containerd[1666]: 2026-01-27 04:47:42.429 [INFO][4082] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" HandleID="k8s-pod-network.9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" Workload="ci--4592--0--0--n--c2731c5fad-k8s-whisker--68746999d7--v2tpn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c2fd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4592-0-0-n-c2731c5fad", "pod":"whisker-68746999d7-v2tpn", "timestamp":"2026-01-27 04:47:42.429601193 +0000 UTC"}, Hostname:"ci-4592-0-0-n-c2731c5fad", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 04:47:42.489546 containerd[1666]: 2026-01-27 04:47:42.429 [INFO][4082] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 04:47:42.489546 containerd[1666]: 2026-01-27 04:47:42.429 [INFO][4082] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 27 04:47:42.489546 containerd[1666]: 2026-01-27 04:47:42.429 [INFO][4082] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-n-c2731c5fad' Jan 27 04:47:42.489546 containerd[1666]: 2026-01-27 04:47:42.440 [INFO][4082] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:42.489546 containerd[1666]: 2026-01-27 04:47:42.445 [INFO][4082] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:42.489546 containerd[1666]: 2026-01-27 04:47:42.450 [INFO][4082] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:42.489546 containerd[1666]: 2026-01-27 04:47:42.452 [INFO][4082] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:42.489546 containerd[1666]: 2026-01-27 04:47:42.454 [INFO][4082] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:42.489722 containerd[1666]: 2026-01-27 04:47:42.454 [INFO][4082] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:42.489722 containerd[1666]: 2026-01-27 04:47:42.455 [INFO][4082] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b Jan 27 04:47:42.489722 containerd[1666]: 2026-01-27 04:47:42.459 [INFO][4082] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:42.489722 containerd[1666]: 2026-01-27 04:47:42.465 [INFO][4082] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.1/26] block=192.168.98.0/26 
handle="k8s-pod-network.9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:42.489722 containerd[1666]: 2026-01-27 04:47:42.465 [INFO][4082] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.1/26] handle="k8s-pod-network.9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:42.489722 containerd[1666]: 2026-01-27 04:47:42.465 [INFO][4082] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 27 04:47:42.489722 containerd[1666]: 2026-01-27 04:47:42.465 [INFO][4082] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.1/26] IPv6=[] ContainerID="9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" HandleID="k8s-pod-network.9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" Workload="ci--4592--0--0--n--c2731c5fad-k8s-whisker--68746999d7--v2tpn-eth0" Jan 27 04:47:42.489849 containerd[1666]: 2026-01-27 04:47:42.468 [INFO][4067] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" Namespace="calico-system" Pod="whisker-68746999d7-v2tpn" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-whisker--68746999d7--v2tpn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-whisker--68746999d7--v2tpn-eth0", GenerateName:"whisker-68746999d7-", Namespace:"calico-system", SelfLink:"", UID:"a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33", ResourceVersion:"900", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"68746999d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"", Pod:"whisker-68746999d7-v2tpn", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.98.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie61560b51f1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:42.489849 containerd[1666]: 2026-01-27 04:47:42.468 [INFO][4067] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.1/32] ContainerID="9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" Namespace="calico-system" Pod="whisker-68746999d7-v2tpn" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-whisker--68746999d7--v2tpn-eth0" Jan 27 04:47:42.489916 containerd[1666]: 2026-01-27 04:47:42.468 [INFO][4067] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie61560b51f1 ContainerID="9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" Namespace="calico-system" Pod="whisker-68746999d7-v2tpn" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-whisker--68746999d7--v2tpn-eth0" Jan 27 04:47:42.489916 containerd[1666]: 2026-01-27 04:47:42.474 [INFO][4067] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" Namespace="calico-system" Pod="whisker-68746999d7-v2tpn" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-whisker--68746999d7--v2tpn-eth0" Jan 27 04:47:42.489969 containerd[1666]: 2026-01-27 04:47:42.475 [INFO][4067] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" Namespace="calico-system" Pod="whisker-68746999d7-v2tpn" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-whisker--68746999d7--v2tpn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-whisker--68746999d7--v2tpn-eth0", GenerateName:"whisker-68746999d7-", Namespace:"calico-system", SelfLink:"", UID:"a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33", ResourceVersion:"900", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"68746999d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b", Pod:"whisker-68746999d7-v2tpn", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.98.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie61560b51f1", MAC:"6e:25:fa:8a:df:a6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:42.490018 containerd[1666]: 2026-01-27 04:47:42.487 [INFO][4067] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" Namespace="calico-system" Pod="whisker-68746999d7-v2tpn" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-whisker--68746999d7--v2tpn-eth0" Jan 27 04:47:42.524691 containerd[1666]: time="2026-01-27T04:47:42.524626078Z" level=info msg="connecting to shim 9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b" address="unix:///run/containerd/s/c8f7cbb21511acc67dda0fe9c310d58e3369d430fa0738bde0b2e1ed02122609" namespace=k8s.io protocol=ttrpc version=3 Jan 27 04:47:42.548419 systemd[1]: Started cri-containerd-9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b.scope - libcontainer container 9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b. 
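
In the Calico IPAM exchange above, the plugin confirms host affinity for the block 192.168.98.0/26 on ci-4592-0-0-n-c2731c5fad and claims 192.168.98.1 from it for whisker-68746999d7-v2tpn, which the workload endpoint then records as 192.168.98.1/32. A minimal sketch of the address arithmetic only (an illustration, not Calico code):

    # A /26 block holds 64 addresses; the claimed address falls inside the
    # host-affine block, and the endpoint stores it as a single /32.
    import ipaddress

    block    = ipaddress.ip_network("192.168.98.0/26")
    assigned = ipaddress.ip_address("192.168.98.1")
    print(block.num_addresses)                      # 64 (.0 through .63)
    print(assigned in block)                        # True
    print(ipaddress.ip_network("192.168.98.1/32"))  # as in IPNetworks above
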
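
The audit records that follow (runc launching the sandbox container, then calico-node, pid 4155, driving bpftool to create and pin its BPF maps and programs) carry the invoked command line in the PROCTITLE field, hex-encoded with NUL bytes separating the arguments. A minimal sketch for decoding one of the shorter values seen below (the variable name is illustrative):

    # Audit PROCTITLE values are the process argv, hex-encoded with NUL separators.
    hex_proctitle = "627066746F6F6C006D6170006C697374002D2D6A736F6E"
    argv = bytes.fromhex(hex_proctitle).split(b"\x00")
    print([a.decode() for a in argv])   # ['bpftool', 'map', 'list', '--json']

The longer values decode the same way, e.g. to the runc and iptables-nft-restore invocations logged alongside them.
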
Jan 27 04:47:42.556000 audit: BPF prog-id=178 op=LOAD Jan 27 04:47:42.557000 audit: BPF prog-id=179 op=LOAD Jan 27 04:47:42.557000 audit[4117]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4106 pid=4117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.557000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931373563633335626539613137366532323938666432353364346438 Jan 27 04:47:42.557000 audit: BPF prog-id=179 op=UNLOAD Jan 27 04:47:42.557000 audit[4117]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4106 pid=4117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.557000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931373563633335626539613137366532323938666432353364346438 Jan 27 04:47:42.557000 audit: BPF prog-id=180 op=LOAD Jan 27 04:47:42.557000 audit[4117]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4106 pid=4117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.557000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931373563633335626539613137366532323938666432353364346438 Jan 27 04:47:42.557000 audit: BPF prog-id=181 op=LOAD Jan 27 04:47:42.557000 audit[4117]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4106 pid=4117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.557000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931373563633335626539613137366532323938666432353364346438 Jan 27 04:47:42.557000 audit: BPF prog-id=181 op=UNLOAD Jan 27 04:47:42.557000 audit[4117]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4106 pid=4117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.557000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931373563633335626539613137366532323938666432353364346438 Jan 27 04:47:42.557000 audit: BPF prog-id=180 op=UNLOAD Jan 27 04:47:42.557000 audit[4117]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4106 pid=4117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.557000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931373563633335626539613137366532323938666432353364346438 Jan 27 04:47:42.557000 audit: BPF prog-id=182 op=LOAD Jan 27 04:47:42.557000 audit[4117]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4106 pid=4117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.557000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931373563633335626539613137366532323938666432353364346438 Jan 27 04:47:42.589946 containerd[1666]: time="2026-01-27T04:47:42.589826131Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-68746999d7-v2tpn,Uid:a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33,Namespace:calico-system,Attempt:0,} returns sandbox id \"9175cc35be9a176e2298fd253d4d8d746f43f6e3bc7919f60e71a2402843284b\"" Jan 27 04:47:42.592317 containerd[1666]: time="2026-01-27T04:47:42.591531499Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 27 04:47:42.935811 containerd[1666]: time="2026-01-27T04:47:42.934132247Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:47:42.941859 containerd[1666]: time="2026-01-27T04:47:42.941806206Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 27 04:47:42.941973 containerd[1666]: time="2026-01-27T04:47:42.941875367Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 27 04:47:42.942957 kubelet[2905]: E0127 04:47:42.942923 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 04:47:42.943228 kubelet[2905]: E0127 04:47:42.942977 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 04:47:42.943254 kubelet[2905]: E0127 04:47:42.943208 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1c7c961a1fbc4629aa41d023389e3c7f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xvk54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68746999d7-v2tpn_calico-system(a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 27 04:47:42.947961 containerd[1666]: time="2026-01-27T04:47:42.947922558Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 27 04:47:42.960000 audit: BPF prog-id=183 op=LOAD Jan 27 04:47:42.960000 audit[4254]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd5d159d8 a2=98 a3=ffffd5d159c8 items=0 ppid=4155 pid=4254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.960000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 04:47:42.960000 audit: BPF prog-id=183 op=UNLOAD Jan 27 04:47:42.960000 audit[4254]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd5d159a8 a3=0 items=0 ppid=4155 pid=4254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.960000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 04:47:42.961000 audit: BPF prog-id=184 op=LOAD Jan 27 04:47:42.961000 audit[4254]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd5d15888 a2=74 a3=95 items=0 ppid=4155 pid=4254 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.961000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 04:47:42.961000 audit: BPF prog-id=184 op=UNLOAD Jan 27 04:47:42.961000 audit[4254]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4155 pid=4254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.961000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 04:47:42.961000 audit: BPF prog-id=185 op=LOAD Jan 27 04:47:42.961000 audit[4254]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd5d158b8 a2=40 a3=ffffd5d158e8 items=0 ppid=4155 pid=4254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.961000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 04:47:42.961000 audit: BPF prog-id=185 op=UNLOAD Jan 27 04:47:42.961000 audit[4254]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffd5d158e8 items=0 ppid=4155 pid=4254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.961000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 27 04:47:42.963000 audit: BPF prog-id=186 op=LOAD Jan 27 04:47:42.963000 audit[4255]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe1ca9ed8 a2=98 a3=ffffe1ca9ec8 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.963000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:42.963000 audit: BPF prog-id=186 op=UNLOAD Jan 27 04:47:42.963000 audit[4255]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffe1ca9ea8 a3=0 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.963000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:42.964000 audit: BPF prog-id=187 
op=LOAD Jan 27 04:47:42.964000 audit[4255]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffe1ca9b68 a2=74 a3=95 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.964000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:42.964000 audit: BPF prog-id=187 op=UNLOAD Jan 27 04:47:42.964000 audit[4255]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.964000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:42.964000 audit: BPF prog-id=188 op=LOAD Jan 27 04:47:42.964000 audit[4255]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffe1ca9bc8 a2=94 a3=2 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.964000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:42.964000 audit: BPF prog-id=188 op=UNLOAD Jan 27 04:47:42.964000 audit[4255]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:42.964000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:43.077000 audit: BPF prog-id=189 op=LOAD Jan 27 04:47:43.077000 audit[4255]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffe1ca9b88 a2=40 a3=ffffe1ca9bb8 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.077000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:43.077000 audit: BPF prog-id=189 op=UNLOAD Jan 27 04:47:43.077000 audit[4255]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffe1ca9bb8 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.077000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:43.087000 audit: BPF prog-id=190 op=LOAD Jan 27 04:47:43.087000 audit[4255]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffe1ca9b98 a2=94 a3=4 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.087000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:43.087000 audit: BPF prog-id=190 op=UNLOAD Jan 27 04:47:43.087000 audit[4255]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.087000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:43.087000 audit: BPF prog-id=191 op=LOAD Jan 27 04:47:43.087000 audit[4255]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe1ca99d8 a2=94 a3=5 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.087000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:43.088000 audit: BPF prog-id=191 op=UNLOAD Jan 27 04:47:43.088000 audit[4255]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.088000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:43.088000 audit: BPF prog-id=192 op=LOAD Jan 27 04:47:43.088000 audit[4255]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffe1ca9c08 a2=94 a3=6 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.088000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:43.088000 audit: BPF prog-id=192 op=UNLOAD Jan 27 04:47:43.088000 audit[4255]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.088000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:43.088000 audit: BPF prog-id=193 op=LOAD Jan 27 04:47:43.088000 audit[4255]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffe1ca93d8 a2=94 a3=83 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.088000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:43.089000 audit: BPF prog-id=194 op=LOAD Jan 27 04:47:43.089000 audit[4255]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffe1ca9198 a2=94 a3=2 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.089000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:43.089000 audit: BPF prog-id=194 op=UNLOAD Jan 27 04:47:43.089000 audit[4255]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.089000 audit: PROCTITLE 
proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:43.089000 audit: BPF prog-id=193 op=UNLOAD Jan 27 04:47:43.089000 audit[4255]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=37bfa620 a3=37bedb00 items=0 ppid=4155 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.089000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 27 04:47:43.099000 audit: BPF prog-id=195 op=LOAD Jan 27 04:47:43.099000 audit[4297]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe8445438 a2=98 a3=ffffe8445428 items=0 ppid=4155 pid=4297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.099000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 04:47:43.100000 audit: BPF prog-id=195 op=UNLOAD Jan 27 04:47:43.100000 audit[4297]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffe8445408 a3=0 items=0 ppid=4155 pid=4297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.100000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 04:47:43.100000 audit: BPF prog-id=196 op=LOAD Jan 27 04:47:43.100000 audit[4297]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe84452e8 a2=74 a3=95 items=0 ppid=4155 pid=4297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.100000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 04:47:43.100000 audit: BPF prog-id=196 op=UNLOAD Jan 27 04:47:43.100000 audit[4297]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4155 pid=4297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.100000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 04:47:43.100000 audit: BPF prog-id=197 op=LOAD Jan 27 04:47:43.100000 audit[4297]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe8445318 a2=40 a3=ffffe8445348 items=0 ppid=4155 pid=4297 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.100000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 04:47:43.100000 audit: BPF prog-id=197 op=UNLOAD Jan 27 04:47:43.100000 audit[4297]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffe8445348 items=0 ppid=4155 pid=4297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.100000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 27 04:47:43.159265 systemd-networkd[1495]: vxlan.calico: Link UP Jan 27 04:47:43.159273 systemd-networkd[1495]: vxlan.calico: Gained carrier Jan 27 04:47:43.175000 audit: BPF prog-id=198 op=LOAD Jan 27 04:47:43.175000 audit[4323]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd045eed8 a2=98 a3=ffffd045eec8 items=0 ppid=4155 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.175000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 04:47:43.175000 audit: BPF prog-id=198 op=UNLOAD Jan 27 04:47:43.175000 audit[4323]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd045eea8 a3=0 items=0 ppid=4155 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.175000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 04:47:43.176000 audit: BPF prog-id=199 op=LOAD Jan 27 04:47:43.176000 audit[4323]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd045ebb8 a2=74 a3=95 items=0 ppid=4155 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.176000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 04:47:43.176000 audit: BPF prog-id=199 op=UNLOAD Jan 27 04:47:43.176000 audit[4323]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4155 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.176000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 04:47:43.176000 audit: BPF prog-id=200 op=LOAD Jan 27 04:47:43.176000 audit[4323]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd045ec18 a2=94 a3=2 items=0 ppid=4155 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.176000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 04:47:43.176000 audit: BPF prog-id=200 op=UNLOAD Jan 27 04:47:43.176000 audit[4323]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4155 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.176000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 04:47:43.176000 audit: BPF prog-id=201 op=LOAD Jan 27 04:47:43.176000 audit[4323]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd045ea98 a2=40 a3=ffffd045eac8 items=0 ppid=4155 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.176000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 04:47:43.176000 audit: BPF prog-id=201 op=UNLOAD Jan 27 04:47:43.176000 audit[4323]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=ffffd045eac8 items=0 ppid=4155 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.176000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 04:47:43.176000 audit: BPF prog-id=202 op=LOAD Jan 27 04:47:43.176000 audit[4323]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd045ebe8 a2=94 a3=b7 items=0 ppid=4155 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.176000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 04:47:43.176000 audit: BPF prog-id=202 op=UNLOAD Jan 27 04:47:43.176000 audit[4323]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4155 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.176000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 04:47:43.176000 audit: BPF prog-id=203 op=LOAD Jan 27 04:47:43.176000 audit[4323]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd045e298 a2=94 a3=2 items=0 ppid=4155 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.176000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 04:47:43.176000 audit: BPF prog-id=203 op=UNLOAD Jan 27 04:47:43.176000 audit[4323]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4155 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.176000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 04:47:43.176000 audit: BPF prog-id=204 op=LOAD Jan 27 04:47:43.176000 audit[4323]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd045e428 a2=94 a3=30 items=0 ppid=4155 pid=4323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.176000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 27 04:47:43.180000 audit: BPF prog-id=205 op=LOAD Jan 27 04:47:43.180000 audit[4326]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffca31f608 a2=98 a3=ffffca31f5f8 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.180000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.180000 audit: BPF prog-id=205 op=UNLOAD Jan 27 04:47:43.180000 audit[4326]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 
a0=3 a1=57156c a2=ffffca31f5d8 a3=0 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.180000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.180000 audit: BPF prog-id=206 op=LOAD Jan 27 04:47:43.180000 audit[4326]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffca31f298 a2=74 a3=95 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.180000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.181000 audit: BPF prog-id=206 op=UNLOAD Jan 27 04:47:43.181000 audit[4326]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.181000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.181000 audit: BPF prog-id=207 op=LOAD Jan 27 04:47:43.181000 audit[4326]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffca31f2f8 a2=94 a3=2 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.181000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.181000 audit: BPF prog-id=207 op=UNLOAD Jan 27 04:47:43.181000 audit[4326]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.181000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.277879 containerd[1666]: time="2026-01-27T04:47:43.277825921Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:47:43.279773 containerd[1666]: time="2026-01-27T04:47:43.279736730Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 27 04:47:43.279853 containerd[1666]: time="2026-01-27T04:47:43.279816011Z" level=info msg="stop pulling 
image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 27 04:47:43.280019 kubelet[2905]: E0127 04:47:43.279979 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 04:47:43.280073 kubelet[2905]: E0127 04:47:43.280032 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 04:47:43.280216 kubelet[2905]: E0127 04:47:43.280153 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xvk54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68746999d7-v2tpn_calico-system(a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 27 04:47:43.281400 kubelet[2905]: E0127 04:47:43.281361 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with 
ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:47:43.282000 audit: BPF prog-id=208 op=LOAD Jan 27 04:47:43.282000 audit[4326]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffca31f2b8 a2=40 a3=ffffca31f2e8 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.282000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.283000 audit: BPF prog-id=208 op=UNLOAD Jan 27 04:47:43.283000 audit[4326]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffca31f2e8 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.283000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.294000 audit: BPF prog-id=209 op=LOAD Jan 27 04:47:43.294000 audit[4326]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffca31f2c8 a2=94 a3=4 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.294000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.294000 audit: BPF prog-id=209 op=UNLOAD Jan 27 04:47:43.294000 audit[4326]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.294000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.294000 audit: BPF prog-id=210 op=LOAD Jan 27 04:47:43.294000 audit[4326]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffca31f108 a2=94 a3=5 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.294000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.294000 audit: BPF prog-id=210 op=UNLOAD Jan 27 04:47:43.294000 audit[4326]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c 
a2=70 a3=5 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.294000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.295000 audit: BPF prog-id=211 op=LOAD Jan 27 04:47:43.295000 audit[4326]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffca31f338 a2=94 a3=6 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.295000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.295000 audit: BPF prog-id=211 op=UNLOAD Jan 27 04:47:43.295000 audit[4326]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.295000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.295000 audit: BPF prog-id=212 op=LOAD Jan 27 04:47:43.295000 audit[4326]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffca31eb08 a2=94 a3=83 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.295000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.295000 audit: BPF prog-id=213 op=LOAD Jan 27 04:47:43.295000 audit[4326]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffca31e8c8 a2=94 a3=2 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.295000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.295000 audit: BPF prog-id=213 op=UNLOAD Jan 27 04:47:43.295000 audit[4326]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.295000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.296000 audit: BPF prog-id=212 
op=UNLOAD Jan 27 04:47:43.296000 audit[4326]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=7a84620 a3=7a77b00 items=0 ppid=4155 pid=4326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.296000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 27 04:47:43.316000 audit: BPF prog-id=204 op=UNLOAD Jan 27 04:47:43.316000 audit[4155]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=4000900200 a2=0 a3=0 items=0 ppid=4148 pid=4155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.316000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 27 04:47:43.358000 audit[4354]: NETFILTER_CFG table=nat:121 family=2 entries=15 op=nft_register_chain pid=4354 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 04:47:43.358000 audit[4354]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=ffffe56fe6c0 a2=0 a3=ffffa619bfa8 items=0 ppid=4155 pid=4354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.358000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 04:47:43.360000 audit[4357]: NETFILTER_CFG table=mangle:122 family=2 entries=16 op=nft_register_chain pid=4357 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 04:47:43.360000 audit[4357]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=ffffc170e530 a2=0 a3=ffffa82effa8 items=0 ppid=4155 pid=4357 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.360000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 04:47:43.367000 audit[4353]: NETFILTER_CFG table=raw:123 family=2 entries=21 op=nft_register_chain pid=4353 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 04:47:43.367000 audit[4353]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=ffffe4468db0 a2=0 a3=ffff9a6e7fa8 items=0 ppid=4155 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.367000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 04:47:43.374000 audit[4355]: NETFILTER_CFG table=filter:124 family=2 entries=94 op=nft_register_chain pid=4355 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 04:47:43.374000 audit[4355]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=53116 a0=3 a1=ffffeecd5010 a2=0 
a3=ffff97224fa8 items=0 ppid=4155 pid=4355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.374000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 04:47:43.699282 systemd-networkd[1495]: calie61560b51f1: Gained IPv6LL Jan 27 04:47:43.827795 kubelet[2905]: I0127 04:47:43.827694 2905 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40" path="/var/lib/kubelet/pods/9d8212ef-8a8e-4c6c-9b61-a5e5cc87fb40/volumes" Jan 27 04:47:43.956031 kubelet[2905]: E0127 04:47:43.955921 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:47:43.980000 audit[4389]: NETFILTER_CFG table=filter:125 family=2 entries=20 op=nft_register_rule pid=4389 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:43.980000 audit[4389]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffd950c340 a2=0 a3=1 items=0 ppid=3073 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.980000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:43.986000 audit[4389]: NETFILTER_CFG table=nat:126 family=2 entries=14 op=nft_register_rule pid=4389 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:43.986000 audit[4389]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffd950c340 a2=0 a3=1 items=0 ppid=3073 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:43.986000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:44.531375 systemd-networkd[1495]: vxlan.calico: Gained IPv6LL Jan 27 04:47:45.826348 containerd[1666]: time="2026-01-27T04:47:45.826295442Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gnjg5,Uid:733d0109-2680-4e52-a103-0ca18ce93ba4,Namespace:kube-system,Attempt:0,}" Jan 27 04:47:45.934149 systemd-networkd[1495]: cali48c8d1f5184: Link UP Jan 27 04:47:45.934394 systemd-networkd[1495]: cali48c8d1f5184: Gained carrier Jan 27 04:47:45.952810 
containerd[1666]: 2026-01-27 04:47:45.871 [INFO][4399] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--gnjg5-eth0 coredns-668d6bf9bc- kube-system 733d0109-2680-4e52-a103-0ca18ce93ba4 829 0 2026-01-27 04:47:10 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4592-0-0-n-c2731c5fad coredns-668d6bf9bc-gnjg5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali48c8d1f5184 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" Namespace="kube-system" Pod="coredns-668d6bf9bc-gnjg5" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--gnjg5-" Jan 27 04:47:45.952810 containerd[1666]: 2026-01-27 04:47:45.872 [INFO][4399] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" Namespace="kube-system" Pod="coredns-668d6bf9bc-gnjg5" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--gnjg5-eth0" Jan 27 04:47:45.952810 containerd[1666]: 2026-01-27 04:47:45.893 [INFO][4413] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" HandleID="k8s-pod-network.c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" Workload="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--gnjg5-eth0" Jan 27 04:47:45.953023 containerd[1666]: 2026-01-27 04:47:45.893 [INFO][4413] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" HandleID="k8s-pod-network.c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" Workload="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--gnjg5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001376e0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4592-0-0-n-c2731c5fad", "pod":"coredns-668d6bf9bc-gnjg5", "timestamp":"2026-01-27 04:47:45.893485225 +0000 UTC"}, Hostname:"ci-4592-0-0-n-c2731c5fad", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 04:47:45.953023 containerd[1666]: 2026-01-27 04:47:45.893 [INFO][4413] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 04:47:45.953023 containerd[1666]: 2026-01-27 04:47:45.893 [INFO][4413] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 27 04:47:45.953023 containerd[1666]: 2026-01-27 04:47:45.893 [INFO][4413] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-n-c2731c5fad' Jan 27 04:47:45.953023 containerd[1666]: 2026-01-27 04:47:45.903 [INFO][4413] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:45.953023 containerd[1666]: 2026-01-27 04:47:45.908 [INFO][4413] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:45.953023 containerd[1666]: 2026-01-27 04:47:45.913 [INFO][4413] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:45.953023 containerd[1666]: 2026-01-27 04:47:45.915 [INFO][4413] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:45.953023 containerd[1666]: 2026-01-27 04:47:45.917 [INFO][4413] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:45.953375 containerd[1666]: 2026-01-27 04:47:45.918 [INFO][4413] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:45.953375 containerd[1666]: 2026-01-27 04:47:45.919 [INFO][4413] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9 Jan 27 04:47:45.953375 containerd[1666]: 2026-01-27 04:47:45.923 [INFO][4413] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:45.953375 containerd[1666]: 2026-01-27 04:47:45.930 [INFO][4413] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.2/26] block=192.168.98.0/26 handle="k8s-pod-network.c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:45.953375 containerd[1666]: 2026-01-27 04:47:45.930 [INFO][4413] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.2/26] handle="k8s-pod-network.c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:45.953375 containerd[1666]: 2026-01-27 04:47:45.930 [INFO][4413] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 27 04:47:45.953375 containerd[1666]: 2026-01-27 04:47:45.930 [INFO][4413] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.2/26] IPv6=[] ContainerID="c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" HandleID="k8s-pod-network.c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" Workload="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--gnjg5-eth0" Jan 27 04:47:45.953512 containerd[1666]: 2026-01-27 04:47:45.932 [INFO][4399] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" Namespace="kube-system" Pod="coredns-668d6bf9bc-gnjg5" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--gnjg5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--gnjg5-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"733d0109-2680-4e52-a103-0ca18ce93ba4", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"", Pod:"coredns-668d6bf9bc-gnjg5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali48c8d1f5184", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:45.953512 containerd[1666]: 2026-01-27 04:47:45.932 [INFO][4399] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.2/32] ContainerID="c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" Namespace="kube-system" Pod="coredns-668d6bf9bc-gnjg5" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--gnjg5-eth0" Jan 27 04:47:45.953512 containerd[1666]: 2026-01-27 04:47:45.932 [INFO][4399] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali48c8d1f5184 ContainerID="c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" Namespace="kube-system" Pod="coredns-668d6bf9bc-gnjg5" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--gnjg5-eth0" Jan 27 04:47:45.953512 containerd[1666]: 2026-01-27 04:47:45.934 [INFO][4399] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-gnjg5" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--gnjg5-eth0" Jan 27 04:47:45.953512 containerd[1666]: 2026-01-27 04:47:45.935 [INFO][4399] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" Namespace="kube-system" Pod="coredns-668d6bf9bc-gnjg5" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--gnjg5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--gnjg5-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"733d0109-2680-4e52-a103-0ca18ce93ba4", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9", Pod:"coredns-668d6bf9bc-gnjg5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali48c8d1f5184", MAC:"22:31:ad:36:3d:7e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:45.953512 containerd[1666]: 2026-01-27 04:47:45.946 [INFO][4399] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" Namespace="kube-system" Pod="coredns-668d6bf9bc-gnjg5" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--gnjg5-eth0" Jan 27 04:47:45.961000 audit[4431]: NETFILTER_CFG table=filter:127 family=2 entries=42 op=nft_register_chain pid=4431 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 04:47:45.965529 kernel: kauditd_printk_skb: 237 callbacks suppressed Jan 27 04:47:45.965608 kernel: audit: type=1325 audit(1769489265.961:669): table=filter:127 family=2 entries=42 op=nft_register_chain pid=4431 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 04:47:45.965629 kernel: audit: type=1300 audit(1769489265.961:669): arch=c00000b7 syscall=211 success=yes exit=22552 a0=3 a1=ffffc927c0f0 a2=0 a3=ffff97d11fa8 items=0 ppid=4155 pid=4431 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:45.961000 audit[4431]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=22552 a0=3 a1=ffffc927c0f0 a2=0 a3=ffff97d11fa8 items=0 ppid=4155 pid=4431 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:45.961000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 04:47:45.971769 kernel: audit: type=1327 audit(1769489265.961:669): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 04:47:45.999328 containerd[1666]: time="2026-01-27T04:47:45.999247525Z" level=info msg="connecting to shim c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9" address="unix:///run/containerd/s/c26b590889b875ec2fba317f709adde278bb2134df0c47576b47278bf18077d4" namespace=k8s.io protocol=ttrpc version=3 Jan 27 04:47:46.044269 systemd[1]: Started cri-containerd-c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9.scope - libcontainer container c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9. Jan 27 04:47:46.061000 audit: BPF prog-id=214 op=LOAD Jan 27 04:47:46.062000 audit: BPF prog-id=215 op=LOAD Jan 27 04:47:46.064437 kernel: audit: type=1334 audit(1769489266.061:670): prog-id=214 op=LOAD Jan 27 04:47:46.064493 kernel: audit: type=1334 audit(1769489266.062:671): prog-id=215 op=LOAD Jan 27 04:47:46.064509 kernel: audit: type=1300 audit(1769489266.062:671): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=4439 pid=4452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.062000 audit[4452]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=4439 pid=4452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.062000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338326232323531316466323166356138666165343262383631663531 Jan 27 04:47:46.070252 kernel: audit: type=1327 audit(1769489266.062:671): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338326232323531316466323166356138666165343262383631663531 Jan 27 04:47:46.070363 kernel: audit: type=1334 audit(1769489266.062:672): prog-id=215 op=UNLOAD Jan 27 04:47:46.062000 audit: BPF prog-id=215 op=UNLOAD Jan 27 04:47:46.062000 audit[4452]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4439 pid=4452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.073832 kernel: audit: type=1300 audit(1769489266.062:672): arch=c00000b7 
syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4439 pid=4452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.073863 kernel: audit: type=1327 audit(1769489266.062:672): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338326232323531316466323166356138666165343262383631663531 Jan 27 04:47:46.062000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338326232323531316466323166356138666165343262383631663531 Jan 27 04:47:46.062000 audit: BPF prog-id=216 op=LOAD Jan 27 04:47:46.062000 audit[4452]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=4439 pid=4452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.062000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338326232323531316466323166356138666165343262383631663531 Jan 27 04:47:46.063000 audit: BPF prog-id=217 op=LOAD Jan 27 04:47:46.063000 audit[4452]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=4439 pid=4452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.063000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338326232323531316466323166356138666165343262383631663531 Jan 27 04:47:46.066000 audit: BPF prog-id=217 op=UNLOAD Jan 27 04:47:46.066000 audit[4452]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4439 pid=4452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.066000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338326232323531316466323166356138666165343262383631663531 Jan 27 04:47:46.066000 audit: BPF prog-id=216 op=UNLOAD Jan 27 04:47:46.066000 audit[4452]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4439 pid=4452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.066000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338326232323531316466323166356138666165343262383631663531 Jan 27 04:47:46.066000 audit: BPF prog-id=218 op=LOAD Jan 27 04:47:46.066000 audit[4452]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=4439 pid=4452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.066000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338326232323531316466323166356138666165343262383631663531 Jan 27 04:47:46.098716 containerd[1666]: time="2026-01-27T04:47:46.098678752Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gnjg5,Uid:733d0109-2680-4e52-a103-0ca18ce93ba4,Namespace:kube-system,Attempt:0,} returns sandbox id \"c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9\"" Jan 27 04:47:46.101697 containerd[1666]: time="2026-01-27T04:47:46.101665727Z" level=info msg="CreateContainer within sandbox \"c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 27 04:47:46.142322 containerd[1666]: time="2026-01-27T04:47:46.142276014Z" level=info msg="Container 5cfb4e95d36239e3515549fd337ac5366c79d558074cd8e5a755c322df391a87: CDI devices from CRI Config.CDIDevices: []" Jan 27 04:47:46.158462 containerd[1666]: time="2026-01-27T04:47:46.158414657Z" level=info msg="CreateContainer within sandbox \"c82b22511df21f5a8fae42b861f518e52675eb9d5f53141700e76e0d515b8db9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5cfb4e95d36239e3515549fd337ac5366c79d558074cd8e5a755c322df391a87\"" Jan 27 04:47:46.159075 containerd[1666]: time="2026-01-27T04:47:46.159044180Z" level=info msg="StartContainer for \"5cfb4e95d36239e3515549fd337ac5366c79d558074cd8e5a755c322df391a87\"" Jan 27 04:47:46.160160 containerd[1666]: time="2026-01-27T04:47:46.160137266Z" level=info msg="connecting to shim 5cfb4e95d36239e3515549fd337ac5366c79d558074cd8e5a755c322df391a87" address="unix:///run/containerd/s/c26b590889b875ec2fba317f709adde278bb2134df0c47576b47278bf18077d4" protocol=ttrpc version=3 Jan 27 04:47:46.182423 systemd[1]: Started cri-containerd-5cfb4e95d36239e3515549fd337ac5366c79d558074cd8e5a755c322df391a87.scope - libcontainer container 5cfb4e95d36239e3515549fd337ac5366c79d558074cd8e5a755c322df391a87. 
Jan 27 04:47:46.191000 audit: BPF prog-id=219 op=LOAD Jan 27 04:47:46.191000 audit: BPF prog-id=220 op=LOAD Jan 27 04:47:46.191000 audit[4477]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4439 pid=4477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.191000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563666234653935643336323339653335313535343966643333376163 Jan 27 04:47:46.191000 audit: BPF prog-id=220 op=UNLOAD Jan 27 04:47:46.191000 audit[4477]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4439 pid=4477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.191000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563666234653935643336323339653335313535343966643333376163 Jan 27 04:47:46.192000 audit: BPF prog-id=221 op=LOAD Jan 27 04:47:46.192000 audit[4477]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4439 pid=4477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.192000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563666234653935643336323339653335313535343966643333376163 Jan 27 04:47:46.192000 audit: BPF prog-id=222 op=LOAD Jan 27 04:47:46.192000 audit[4477]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4439 pid=4477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.192000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563666234653935643336323339653335313535343966643333376163 Jan 27 04:47:46.192000 audit: BPF prog-id=222 op=UNLOAD Jan 27 04:47:46.192000 audit[4477]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4439 pid=4477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.192000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563666234653935643336323339653335313535343966643333376163 Jan 27 04:47:46.192000 audit: BPF prog-id=221 op=UNLOAD Jan 27 04:47:46.192000 audit[4477]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4439 pid=4477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.192000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563666234653935643336323339653335313535343966643333376163 Jan 27 04:47:46.192000 audit: BPF prog-id=223 op=LOAD Jan 27 04:47:46.192000 audit[4477]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4439 pid=4477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:46.192000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563666234653935643336323339653335313535343966643333376163 Jan 27 04:47:46.207492 containerd[1666]: time="2026-01-27T04:47:46.207317906Z" level=info msg="StartContainer for \"5cfb4e95d36239e3515549fd337ac5366c79d558074cd8e5a755c322df391a87\" returns successfully" Jan 27 04:47:46.826429 containerd[1666]: time="2026-01-27T04:47:46.826380785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6svnw,Uid:22752498-52cd-4c33-8c15-87b19e76c80e,Namespace:kube-system,Attempt:0,}" Jan 27 04:47:46.826995 containerd[1666]: time="2026-01-27T04:47:46.826433385Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-trbp2,Uid:1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b,Namespace:calico-system,Attempt:0,}" Jan 27 04:47:46.963651 systemd-networkd[1495]: cali48c8d1f5184: Gained IPv6LL Jan 27 04:47:46.981411 kubelet[2905]: I0127 04:47:46.980965 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-gnjg5" podStartSLOduration=36.980945773 podStartE2EDuration="36.980945773s" podCreationTimestamp="2026-01-27 04:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 04:47:46.977865997 +0000 UTC m=+43.235477450" watchObservedRunningTime="2026-01-27 04:47:46.980945773 +0000 UTC m=+43.238557266" Jan 27 04:47:46.987397 systemd-networkd[1495]: cali4ccc93b2085: Link UP Jan 27 04:47:46.987972 systemd-networkd[1495]: cali4ccc93b2085: Gained carrier Jan 27 04:47:47.000000 audit[4560]: NETFILTER_CFG table=filter:128 family=2 entries=20 op=nft_register_rule pid=4560 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:47.000000 audit[4560]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffe20b0c00 a2=0 a3=1 items=0 ppid=3073 pid=4560 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.000000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:47.006000 audit[4560]: NETFILTER_CFG table=nat:129 family=2 entries=14 op=nft_register_rule pid=4560 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:47.006000 audit[4560]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffe20b0c00 a2=0 a3=1 items=0 ppid=3073 pid=4560 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.006000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.907 [INFO][4511] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--6svnw-eth0 coredns-668d6bf9bc- kube-system 22752498-52cd-4c33-8c15-87b19e76c80e 818 0 2026-01-27 04:47:10 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4592-0-0-n-c2731c5fad coredns-668d6bf9bc-6svnw eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4ccc93b2085 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" Namespace="kube-system" Pod="coredns-668d6bf9bc-6svnw" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--6svnw-" Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.907 [INFO][4511] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" Namespace="kube-system" Pod="coredns-668d6bf9bc-6svnw" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--6svnw-eth0" Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.932 [INFO][4544] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" HandleID="k8s-pod-network.cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" Workload="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--6svnw-eth0" Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.932 [INFO][4544] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" HandleID="k8s-pod-network.cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" Workload="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--6svnw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000528890), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4592-0-0-n-c2731c5fad", "pod":"coredns-668d6bf9bc-6svnw", "timestamp":"2026-01-27 04:47:46.932073044 +0000 UTC"}, Hostname:"ci-4592-0-0-n-c2731c5fad", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.932 [INFO][4544] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.932 [INFO][4544] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.932 [INFO][4544] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-n-c2731c5fad' Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.941 [INFO][4544] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.946 [INFO][4544] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.951 [INFO][4544] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.953 [INFO][4544] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.956 [INFO][4544] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.956 [INFO][4544] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.957 [INFO][4544] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.962 [INFO][4544] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.975 [INFO][4544] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.3/26] block=192.168.98.0/26 handle="k8s-pod-network.cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.975 [INFO][4544] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.3/26] handle="k8s-pod-network.cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.975 [INFO][4544] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 27 04:47:47.009557 containerd[1666]: 2026-01-27 04:47:46.975 [INFO][4544] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.3/26] IPv6=[] ContainerID="cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" HandleID="k8s-pod-network.cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" Workload="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--6svnw-eth0" Jan 27 04:47:47.010045 containerd[1666]: 2026-01-27 04:47:46.982 [INFO][4511] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" Namespace="kube-system" Pod="coredns-668d6bf9bc-6svnw" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--6svnw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--6svnw-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"22752498-52cd-4c33-8c15-87b19e76c80e", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"", Pod:"coredns-668d6bf9bc-6svnw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4ccc93b2085", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:47.010045 containerd[1666]: 2026-01-27 04:47:46.983 [INFO][4511] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.3/32] ContainerID="cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" Namespace="kube-system" Pod="coredns-668d6bf9bc-6svnw" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--6svnw-eth0" Jan 27 04:47:47.010045 containerd[1666]: 2026-01-27 04:47:46.983 [INFO][4511] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4ccc93b2085 ContainerID="cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" Namespace="kube-system" Pod="coredns-668d6bf9bc-6svnw" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--6svnw-eth0" Jan 27 04:47:47.010045 containerd[1666]: 2026-01-27 04:47:46.988 [INFO][4511] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-6svnw" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--6svnw-eth0" Jan 27 04:47:47.010045 containerd[1666]: 2026-01-27 04:47:46.990 [INFO][4511] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" Namespace="kube-system" Pod="coredns-668d6bf9bc-6svnw" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--6svnw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--6svnw-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"22752498-52cd-4c33-8c15-87b19e76c80e", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd", Pod:"coredns-668d6bf9bc-6svnw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4ccc93b2085", MAC:"22:5f:69:36:76:79", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:47.010045 containerd[1666]: 2026-01-27 04:47:47.005 [INFO][4511] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" Namespace="kube-system" Pod="coredns-668d6bf9bc-6svnw" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-coredns--668d6bf9bc--6svnw-eth0" Jan 27 04:47:47.026000 audit[4571]: NETFILTER_CFG table=filter:130 family=2 entries=17 op=nft_register_rule pid=4571 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:47.026000 audit[4571]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffec011ca0 a2=0 a3=1 items=0 ppid=3073 pid=4571 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.026000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:47.033000 audit[4571]: NETFILTER_CFG table=nat:131 family=2 entries=35 op=nft_register_chain pid=4571 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:47.033000 audit[4571]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=ffffec011ca0 a2=0 a3=1 items=0 ppid=3073 pid=4571 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.033000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:47.033000 audit[4572]: NETFILTER_CFG table=filter:132 family=2 entries=42 op=nft_register_chain pid=4572 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 04:47:47.033000 audit[4572]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=22008 a0=3 a1=ffffceb8c090 a2=0 a3=ffff82f52fa8 items=0 ppid=4155 pid=4572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.033000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 04:47:47.048204 containerd[1666]: time="2026-01-27T04:47:47.047634553Z" level=info msg="connecting to shim cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd" address="unix:///run/containerd/s/6876143b5a7dde8e7d8d22870290a2fbad27402588bc8565a017fde11957be99" namespace=k8s.io protocol=ttrpc version=3 Jan 27 04:47:47.080354 systemd[1]: Started cri-containerd-cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd.scope - libcontainer container cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd. 
Jan 27 04:47:47.086808 systemd-networkd[1495]: calib0e9608d1d0: Link UP Jan 27 04:47:47.087565 systemd-networkd[1495]: calib0e9608d1d0: Gained carrier Jan 27 04:47:47.092000 audit: BPF prog-id=224 op=LOAD Jan 27 04:47:47.092000 audit: BPF prog-id=225 op=LOAD Jan 27 04:47:47.092000 audit[4593]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=4581 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.092000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362313963666138303432666438333963386435333030623139356461 Jan 27 04:47:47.092000 audit: BPF prog-id=225 op=UNLOAD Jan 27 04:47:47.092000 audit[4593]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4581 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.092000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362313963666138303432666438333963386435333030623139356461 Jan 27 04:47:47.092000 audit: BPF prog-id=226 op=LOAD Jan 27 04:47:47.092000 audit[4593]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=4581 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.092000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362313963666138303432666438333963386435333030623139356461 Jan 27 04:47:47.093000 audit: BPF prog-id=227 op=LOAD Jan 27 04:47:47.093000 audit[4593]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=4581 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.093000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362313963666138303432666438333963386435333030623139356461 Jan 27 04:47:47.093000 audit: BPF prog-id=227 op=UNLOAD Jan 27 04:47:47.093000 audit[4593]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4581 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.093000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362313963666138303432666438333963386435333030623139356461 Jan 27 04:47:47.093000 audit: BPF prog-id=226 op=UNLOAD Jan 27 04:47:47.093000 audit[4593]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4581 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.093000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362313963666138303432666438333963386435333030623139356461 Jan 27 04:47:47.093000 audit: BPF prog-id=228 op=LOAD Jan 27 04:47:47.093000 audit[4593]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=4581 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.093000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362313963666138303432666438333963386435333030623139356461 Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:46.907 [INFO][4523] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--n--c2731c5fad-k8s-goldmane--666569f655--trbp2-eth0 goldmane-666569f655- calico-system 1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b 826 0 2026-01-27 04:47:24 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4592-0-0-n-c2731c5fad goldmane-666569f655-trbp2 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calib0e9608d1d0 [] [] }} ContainerID="4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" Namespace="calico-system" Pod="goldmane-666569f655-trbp2" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-goldmane--666569f655--trbp2-" Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:46.908 [INFO][4523] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" Namespace="calico-system" Pod="goldmane-666569f655-trbp2" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-goldmane--666569f655--trbp2-eth0" Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:46.935 [INFO][4543] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" HandleID="k8s-pod-network.4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" Workload="ci--4592--0--0--n--c2731c5fad-k8s-goldmane--666569f655--trbp2-eth0" Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:46.935 [INFO][4543] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" 
HandleID="k8s-pod-network.4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" Workload="ci--4592--0--0--n--c2731c5fad-k8s-goldmane--666569f655--trbp2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137db0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4592-0-0-n-c2731c5fad", "pod":"goldmane-666569f655-trbp2", "timestamp":"2026-01-27 04:47:46.935725622 +0000 UTC"}, Hostname:"ci-4592-0-0-n-c2731c5fad", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:46.935 [INFO][4543] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:46.976 [INFO][4543] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:46.976 [INFO][4543] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-n-c2731c5fad' Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:47.043 [INFO][4543] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:47.054 [INFO][4543] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:47.059 [INFO][4543] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:47.062 [INFO][4543] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:47.064 [INFO][4543] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:47.065 [INFO][4543] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:47.066 [INFO][4543] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74 Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:47.072 [INFO][4543] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:47.078 [INFO][4543] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.4/26] block=192.168.98.0/26 handle="k8s-pod-network.4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:47.078 [INFO][4543] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.4/26] handle="k8s-pod-network.4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:47.078 [INFO][4543] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 27 04:47:47.114718 containerd[1666]: 2026-01-27 04:47:47.078 [INFO][4543] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.4/26] IPv6=[] ContainerID="4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" HandleID="k8s-pod-network.4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" Workload="ci--4592--0--0--n--c2731c5fad-k8s-goldmane--666569f655--trbp2-eth0" Jan 27 04:47:47.115241 containerd[1666]: 2026-01-27 04:47:47.082 [INFO][4523] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" Namespace="calico-system" Pod="goldmane-666569f655-trbp2" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-goldmane--666569f655--trbp2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-goldmane--666569f655--trbp2-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"", Pod:"goldmane-666569f655-trbp2", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.98.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib0e9608d1d0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:47.115241 containerd[1666]: 2026-01-27 04:47:47.082 [INFO][4523] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.4/32] ContainerID="4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" Namespace="calico-system" Pod="goldmane-666569f655-trbp2" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-goldmane--666569f655--trbp2-eth0" Jan 27 04:47:47.115241 containerd[1666]: 2026-01-27 04:47:47.083 [INFO][4523] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib0e9608d1d0 ContainerID="4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" Namespace="calico-system" Pod="goldmane-666569f655-trbp2" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-goldmane--666569f655--trbp2-eth0" Jan 27 04:47:47.115241 containerd[1666]: 2026-01-27 04:47:47.087 [INFO][4523] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" Namespace="calico-system" Pod="goldmane-666569f655-trbp2" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-goldmane--666569f655--trbp2-eth0" Jan 27 04:47:47.115241 containerd[1666]: 2026-01-27 04:47:47.089 [INFO][4523] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" 
Namespace="calico-system" Pod="goldmane-666569f655-trbp2" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-goldmane--666569f655--trbp2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-goldmane--666569f655--trbp2-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74", Pod:"goldmane-666569f655-trbp2", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.98.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib0e9608d1d0", MAC:"32:7d:76:ff:b8:31", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:47.115241 containerd[1666]: 2026-01-27 04:47:47.110 [INFO][4523] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" Namespace="calico-system" Pod="goldmane-666569f655-trbp2" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-goldmane--666569f655--trbp2-eth0" Jan 27 04:47:47.124404 containerd[1666]: time="2026-01-27T04:47:47.124355665Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6svnw,Uid:22752498-52cd-4c33-8c15-87b19e76c80e,Namespace:kube-system,Attempt:0,} returns sandbox id \"cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd\"" Jan 27 04:47:47.128281 containerd[1666]: time="2026-01-27T04:47:47.128117204Z" level=info msg="CreateContainer within sandbox \"cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 27 04:47:47.126000 audit[4628]: NETFILTER_CFG table=filter:133 family=2 entries=48 op=nft_register_chain pid=4628 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 04:47:47.126000 audit[4628]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=26352 a0=3 a1=ffffd3372f90 a2=0 a3=ffff9f697fa8 items=0 ppid=4155 pid=4628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.126000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 04:47:47.151025 containerd[1666]: time="2026-01-27T04:47:47.150972561Z" level=info msg="Container 840a2867076907e79c1a453339322b4770f887d0e19994f909c1913134715958: CDI devices from CRI Config.CDIDevices: []" Jan 27 
04:47:47.152213 containerd[1666]: time="2026-01-27T04:47:47.152174247Z" level=info msg="connecting to shim 4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74" address="unix:///run/containerd/s/de674d8d4d81a73bfe14b1eddd45cceef06a3bcd6dbabf1ab1b2381e40ad73a0" namespace=k8s.io protocol=ttrpc version=3 Jan 27 04:47:47.170134 containerd[1666]: time="2026-01-27T04:47:47.170068018Z" level=info msg="CreateContainer within sandbox \"cb19cfa8042fd839c8d5300b195da224663a28e58cb45e3849c02e1a2d2700dd\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"840a2867076907e79c1a453339322b4770f887d0e19994f909c1913134715958\"" Jan 27 04:47:47.170543 containerd[1666]: time="2026-01-27T04:47:47.170514780Z" level=info msg="StartContainer for \"840a2867076907e79c1a453339322b4770f887d0e19994f909c1913134715958\"" Jan 27 04:47:47.171743 containerd[1666]: time="2026-01-27T04:47:47.171692466Z" level=info msg="connecting to shim 840a2867076907e79c1a453339322b4770f887d0e19994f909c1913134715958" address="unix:///run/containerd/s/6876143b5a7dde8e7d8d22870290a2fbad27402588bc8565a017fde11957be99" protocol=ttrpc version=3 Jan 27 04:47:47.175358 systemd[1]: Started cri-containerd-4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74.scope - libcontainer container 4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74. Jan 27 04:47:47.209357 systemd[1]: Started cri-containerd-840a2867076907e79c1a453339322b4770f887d0e19994f909c1913134715958.scope - libcontainer container 840a2867076907e79c1a453339322b4770f887d0e19994f909c1913134715958. Jan 27 04:47:47.210000 audit: BPF prog-id=229 op=LOAD Jan 27 04:47:47.211000 audit: BPF prog-id=230 op=LOAD Jan 27 04:47:47.211000 audit[4650]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4638 pid=4650 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.211000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465366135656438303966623362373332643437313838323533356465 Jan 27 04:47:47.211000 audit: BPF prog-id=230 op=UNLOAD Jan 27 04:47:47.211000 audit[4650]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4638 pid=4650 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.211000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465366135656438303966623362373332643437313838323533356465 Jan 27 04:47:47.212000 audit: BPF prog-id=231 op=LOAD Jan 27 04:47:47.212000 audit[4650]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4638 pid=4650 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.212000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465366135656438303966623362373332643437313838323533356465 Jan 27 04:47:47.212000 audit: BPF prog-id=232 op=LOAD Jan 27 04:47:47.212000 audit[4650]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4638 pid=4650 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.212000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465366135656438303966623362373332643437313838323533356465 Jan 27 04:47:47.212000 audit: BPF prog-id=232 op=UNLOAD Jan 27 04:47:47.212000 audit[4650]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4638 pid=4650 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.212000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465366135656438303966623362373332643437313838323533356465 Jan 27 04:47:47.212000 audit: BPF prog-id=231 op=UNLOAD Jan 27 04:47:47.212000 audit[4650]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4638 pid=4650 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.212000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465366135656438303966623362373332643437313838323533356465 Jan 27 04:47:47.213000 audit: BPF prog-id=233 op=LOAD Jan 27 04:47:47.213000 audit[4650]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4638 pid=4650 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.213000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465366135656438303966623362373332643437313838323533356465 Jan 27 04:47:47.218000 audit: BPF prog-id=234 op=LOAD Jan 27 04:47:47.218000 audit: BPF prog-id=235 op=LOAD Jan 27 04:47:47.218000 audit[4662]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=4581 pid=4662 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.218000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3834306132383637303736393037653739633161343533333339333232 Jan 27 04:47:47.219000 audit: BPF prog-id=235 op=UNLOAD Jan 27 04:47:47.219000 audit[4662]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4581 pid=4662 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.219000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3834306132383637303736393037653739633161343533333339333232 Jan 27 04:47:47.219000 audit: BPF prog-id=236 op=LOAD Jan 27 04:47:47.219000 audit[4662]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=4581 pid=4662 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.219000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3834306132383637303736393037653739633161343533333339333232 Jan 27 04:47:47.219000 audit: BPF prog-id=237 op=LOAD Jan 27 04:47:47.219000 audit[4662]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=4581 pid=4662 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.219000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3834306132383637303736393037653739633161343533333339333232 Jan 27 04:47:47.219000 audit: BPF prog-id=237 op=UNLOAD Jan 27 04:47:47.219000 audit[4662]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4581 pid=4662 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.219000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3834306132383637303736393037653739633161343533333339333232 Jan 27 04:47:47.219000 audit: BPF prog-id=236 op=UNLOAD Jan 27 04:47:47.219000 audit[4662]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4581 pid=4662 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.219000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3834306132383637303736393037653739633161343533333339333232 Jan 27 04:47:47.219000 audit: BPF prog-id=238 op=LOAD Jan 27 04:47:47.219000 audit[4662]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=4581 pid=4662 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.219000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3834306132383637303736393037653739633161343533333339333232 Jan 27 04:47:47.259511 containerd[1666]: time="2026-01-27T04:47:47.259471674Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-trbp2,Uid:1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b,Namespace:calico-system,Attempt:0,} returns sandbox id \"4e6a5ed809fb3b732d471882535dec0cf4dfd3556d2bcd5870925635184eea74\"" Jan 27 04:47:47.261370 containerd[1666]: time="2026-01-27T04:47:47.261312924Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 27 04:47:47.268288 containerd[1666]: time="2026-01-27T04:47:47.268255319Z" level=info msg="StartContainer for \"840a2867076907e79c1a453339322b4770f887d0e19994f909c1913134715958\" returns successfully" Jan 27 04:47:47.605188 containerd[1666]: time="2026-01-27T04:47:47.605125278Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:47:47.607592 containerd[1666]: time="2026-01-27T04:47:47.607517890Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 27 04:47:47.607672 containerd[1666]: time="2026-01-27T04:47:47.607568210Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 27 04:47:47.607967 kubelet[2905]: E0127 04:47:47.607913 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 04:47:47.607967 kubelet[2905]: E0127 04:47:47.607962 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 04:47:47.608427 kubelet[2905]: E0127 04:47:47.608344 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zbn74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-trbp2_calico-system(1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 27 04:47:47.609600 kubelet[2905]: E0127 04:47:47.609539 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:47:47.826641 containerd[1666]: time="2026-01-27T04:47:47.826323566Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-b5b8d765d-qg4kw,Uid:e725b346-d7db-48ca-8580-5074b068cd87,Namespace:calico-apiserver,Attempt:0,}" Jan 27 04:47:47.826641 containerd[1666]: time="2026-01-27T04:47:47.826410367Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-799789d486-wpdhm,Uid:5edc651f-3273-46b9-a554-3e38c11ea910,Namespace:calico-system,Attempt:0,}" Jan 27 04:47:47.950797 systemd-networkd[1495]: cali8691599140b: Link UP Jan 27 04:47:47.951576 systemd-networkd[1495]: cali8691599140b: Gained carrier Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.882 [INFO][4720] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--n--c2731c5fad-k8s-calico--kube--controllers--799789d486--wpdhm-eth0 calico-kube-controllers-799789d486- calico-system 5edc651f-3273-46b9-a554-3e38c11ea910 823 0 2026-01-27 04:47:26 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:799789d486 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4592-0-0-n-c2731c5fad calico-kube-controllers-799789d486-wpdhm eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali8691599140b [] [] }} ContainerID="19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" Namespace="calico-system" Pod="calico-kube-controllers-799789d486-wpdhm" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--kube--controllers--799789d486--wpdhm-" Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.882 [INFO][4720] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" Namespace="calico-system" Pod="calico-kube-controllers-799789d486-wpdhm" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--kube--controllers--799789d486--wpdhm-eth0" Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.908 [INFO][4738] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" HandleID="k8s-pod-network.19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" Workload="ci--4592--0--0--n--c2731c5fad-k8s-calico--kube--controllers--799789d486--wpdhm-eth0" Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.908 [INFO][4738] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" HandleID="k8s-pod-network.19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" Workload="ci--4592--0--0--n--c2731c5fad-k8s-calico--kube--controllers--799789d486--wpdhm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c790), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4592-0-0-n-c2731c5fad", "pod":"calico-kube-controllers-799789d486-wpdhm", "timestamp":"2026-01-27 04:47:47.908772427 +0000 UTC"}, Hostname:"ci-4592-0-0-n-c2731c5fad", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.908 [INFO][4738] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.909 [INFO][4738] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.909 [INFO][4738] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-n-c2731c5fad' Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.917 [INFO][4738] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.922 [INFO][4738] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.926 [INFO][4738] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.928 [INFO][4738] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.930 [INFO][4738] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.930 [INFO][4738] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.931 [INFO][4738] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200 Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.936 [INFO][4738] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.943 [INFO][4738] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.5/26] block=192.168.98.0/26 handle="k8s-pod-network.19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.943 [INFO][4738] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.5/26] handle="k8s-pod-network.19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.943 [INFO][4738] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
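(Editor's note: the ipam/ipam.go entries above trace the per-host affinity flow — confirm affinity for block 192.168.98.0/26, then claim the next free address, here 192.168.98.5/26 after .4 went to the goldmane pod. The sketch below is not Calico's allocator; it only illustrates the "assign next free address from an affine block" step, and it assumes .1–.4 are already taken as the earlier assignments in this log suggest. Handle bookkeeping, datastore writes, and conflict retries are omitted.

    # Illustrative only: mimics the "Attempting to assign 1 addresses from block"
    # step seen in the ipam/ipam.go log lines.
    import ipaddress

    def assign_from_block(block_cidr: str, already_assigned: set[str]) -> str:
        """Return the next unused host address in the block, CIDR-suffixed."""
        block = ipaddress.ip_network(block_cidr)
        for host in block.hosts():
            if str(host) not in already_assigned:
                already_assigned.add(str(host))
                return f"{host}/{block.prefixlen}"
        raise RuntimeError(f"block {block_cidr} exhausted")

    # Assumed prior assignments on host ci-4592-0-0-n-c2731c5fad:
    used = {"192.168.98.1", "192.168.98.2", "192.168.98.3", "192.168.98.4"}
    print(assign_from_block("192.168.98.0/26", used))  # 192.168.98.5/26
)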
Jan 27 04:47:47.974685 containerd[1666]: 2026-01-27 04:47:47.943 [INFO][4738] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.5/26] IPv6=[] ContainerID="19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" HandleID="k8s-pod-network.19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" Workload="ci--4592--0--0--n--c2731c5fad-k8s-calico--kube--controllers--799789d486--wpdhm-eth0" Jan 27 04:47:47.977686 containerd[1666]: 2026-01-27 04:47:47.945 [INFO][4720] cni-plugin/k8s.go 418: Populated endpoint ContainerID="19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" Namespace="calico-system" Pod="calico-kube-controllers-799789d486-wpdhm" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--kube--controllers--799789d486--wpdhm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-calico--kube--controllers--799789d486--wpdhm-eth0", GenerateName:"calico-kube-controllers-799789d486-", Namespace:"calico-system", SelfLink:"", UID:"5edc651f-3273-46b9-a554-3e38c11ea910", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"799789d486", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"", Pod:"calico-kube-controllers-799789d486-wpdhm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8691599140b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:47.977686 containerd[1666]: 2026-01-27 04:47:47.946 [INFO][4720] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.5/32] ContainerID="19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" Namespace="calico-system" Pod="calico-kube-controllers-799789d486-wpdhm" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--kube--controllers--799789d486--wpdhm-eth0" Jan 27 04:47:47.977686 containerd[1666]: 2026-01-27 04:47:47.946 [INFO][4720] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8691599140b ContainerID="19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" Namespace="calico-system" Pod="calico-kube-controllers-799789d486-wpdhm" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--kube--controllers--799789d486--wpdhm-eth0" Jan 27 04:47:47.977686 containerd[1666]: 2026-01-27 04:47:47.952 [INFO][4720] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" Namespace="calico-system" Pod="calico-kube-controllers-799789d486-wpdhm" 
WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--kube--controllers--799789d486--wpdhm-eth0" Jan 27 04:47:47.977686 containerd[1666]: 2026-01-27 04:47:47.952 [INFO][4720] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" Namespace="calico-system" Pod="calico-kube-controllers-799789d486-wpdhm" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--kube--controllers--799789d486--wpdhm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-calico--kube--controllers--799789d486--wpdhm-eth0", GenerateName:"calico-kube-controllers-799789d486-", Namespace:"calico-system", SelfLink:"", UID:"5edc651f-3273-46b9-a554-3e38c11ea910", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"799789d486", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200", Pod:"calico-kube-controllers-799789d486-wpdhm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8691599140b", MAC:"fe:41:d3:a2:74:e7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:47.977686 containerd[1666]: 2026-01-27 04:47:47.967 [INFO][4720] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" Namespace="calico-system" Pod="calico-kube-controllers-799789d486-wpdhm" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--kube--controllers--799789d486--wpdhm-eth0" Jan 27 04:47:47.978904 kubelet[2905]: E0127 04:47:47.978072 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:47:47.980000 audit[4764]: NETFILTER_CFG table=filter:134 family=2 entries=44 op=nft_register_chain pid=4764 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 04:47:47.980000 audit[4764]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=21936 a0=3 a1=ffffe256b860 a2=0 a3=ffff826f5fa8 items=0 ppid=4155 pid=4764 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:47.980000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 04:47:48.004625 kubelet[2905]: I0127 04:47:48.004196 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-6svnw" podStartSLOduration=38.004178673 podStartE2EDuration="38.004178673s" podCreationTimestamp="2026-01-27 04:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 04:47:48.003935032 +0000 UTC m=+44.261546485" watchObservedRunningTime="2026-01-27 04:47:48.004178673 +0000 UTC m=+44.261790126" Jan 27 04:47:48.011000 audit[4766]: NETFILTER_CFG table=filter:135 family=2 entries=14 op=nft_register_rule pid=4766 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:48.011000 audit[4766]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe402a060 a2=0 a3=1 items=0 ppid=3073 pid=4766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.011000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:48.016000 audit[4766]: NETFILTER_CFG table=nat:136 family=2 entries=20 op=nft_register_rule pid=4766 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:48.016000 audit[4766]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffe402a060 a2=0 a3=1 items=0 ppid=3073 pid=4766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.016000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:48.027453 containerd[1666]: time="2026-01-27T04:47:48.027342712Z" level=info msg="connecting to shim 19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200" address="unix:///run/containerd/s/6d62b8fb07b64829073252298f48abf07a6ee66148f635dd7452e58200ea3709" namespace=k8s.io protocol=ttrpc version=3 Jan 27 04:47:48.053339 systemd[1]: Started cri-containerd-19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200.scope - libcontainer container 19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200. 
Jan 27 04:47:48.068302 systemd-networkd[1495]: cali1a29945bb9d: Link UP Jan 27 04:47:48.068978 systemd-networkd[1495]: cali1a29945bb9d: Gained carrier Jan 27 04:47:48.070000 audit: BPF prog-id=239 op=LOAD Jan 27 04:47:48.071000 audit: BPF prog-id=240 op=LOAD Jan 27 04:47:48.071000 audit[4788]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4777 pid=4788 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.071000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139613334363339373364663339633034396438363831316236313661 Jan 27 04:47:48.071000 audit: BPF prog-id=240 op=UNLOAD Jan 27 04:47:48.071000 audit[4788]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4777 pid=4788 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.071000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139613334363339373364663339633034396438363831316236313661 Jan 27 04:47:48.073000 audit: BPF prog-id=241 op=LOAD Jan 27 04:47:48.073000 audit[4788]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4777 pid=4788 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.073000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139613334363339373364663339633034396438363831316236313661 Jan 27 04:47:48.073000 audit: BPF prog-id=242 op=LOAD Jan 27 04:47:48.073000 audit[4788]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4777 pid=4788 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.073000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139613334363339373364663339633034396438363831316236313661 Jan 27 04:47:48.073000 audit: BPF prog-id=242 op=UNLOAD Jan 27 04:47:48.073000 audit[4788]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4777 pid=4788 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.073000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139613334363339373364663339633034396438363831316236313661 Jan 27 04:47:48.073000 audit: BPF prog-id=241 op=UNLOAD Jan 27 04:47:48.073000 audit[4788]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4777 pid=4788 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.073000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139613334363339373364663339633034396438363831316236313661 Jan 27 04:47:48.073000 audit: BPF prog-id=243 op=LOAD Jan 27 04:47:48.073000 audit[4788]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4777 pid=4788 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.073000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3139613334363339373364663339633034396438363831316236313661 Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:47.879 [INFO][4709] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--qg4kw-eth0 calico-apiserver-b5b8d765d- calico-apiserver e725b346-d7db-48ca-8580-5074b068cd87 828 0 2026-01-27 04:47:19 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b5b8d765d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4592-0-0-n-c2731c5fad calico-apiserver-b5b8d765d-qg4kw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1a29945bb9d [] [] }} ContainerID="85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-qg4kw" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--qg4kw-" Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:47.879 [INFO][4709] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-qg4kw" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--qg4kw-eth0" Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:47.909 [INFO][4745] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" HandleID="k8s-pod-network.85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" Workload="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--qg4kw-eth0" Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:47.910 [INFO][4745] ipam/ipam_plugin.go 
275: Auto assigning IP ContainerID="85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" HandleID="k8s-pod-network.85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" Workload="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--qg4kw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3db0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4592-0-0-n-c2731c5fad", "pod":"calico-apiserver-b5b8d765d-qg4kw", "timestamp":"2026-01-27 04:47:47.909793112 +0000 UTC"}, Hostname:"ci-4592-0-0-n-c2731c5fad", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:47.910 [INFO][4745] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:47.943 [INFO][4745] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:47.944 [INFO][4745] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-n-c2731c5fad' Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:48.022 [INFO][4745] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:48.031 [INFO][4745] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:48.038 [INFO][4745] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:48.040 [INFO][4745] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:48.043 [INFO][4745] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:48.044 [INFO][4745] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:48.047 [INFO][4745] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7 Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:48.052 [INFO][4745] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:48.060 [INFO][4745] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.6/26] block=192.168.98.0/26 handle="k8s-pod-network.85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:48.060 [INFO][4745] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.6/26] handle="k8s-pod-network.85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 
04:47:48.060 [INFO][4745] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 27 04:47:48.081581 containerd[1666]: 2026-01-27 04:47:48.060 [INFO][4745] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.6/26] IPv6=[] ContainerID="85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" HandleID="k8s-pod-network.85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" Workload="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--qg4kw-eth0" Jan 27 04:47:48.082292 containerd[1666]: 2026-01-27 04:47:48.065 [INFO][4709] cni-plugin/k8s.go 418: Populated endpoint ContainerID="85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-qg4kw" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--qg4kw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--qg4kw-eth0", GenerateName:"calico-apiserver-b5b8d765d-", Namespace:"calico-apiserver", SelfLink:"", UID:"e725b346-d7db-48ca-8580-5074b068cd87", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b5b8d765d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"", Pod:"calico-apiserver-b5b8d765d-qg4kw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1a29945bb9d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:48.082292 containerd[1666]: 2026-01-27 04:47:48.065 [INFO][4709] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.6/32] ContainerID="85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-qg4kw" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--qg4kw-eth0" Jan 27 04:47:48.082292 containerd[1666]: 2026-01-27 04:47:48.066 [INFO][4709] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1a29945bb9d ContainerID="85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-qg4kw" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--qg4kw-eth0" Jan 27 04:47:48.082292 containerd[1666]: 2026-01-27 04:47:48.067 [INFO][4709] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-qg4kw" 
WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--qg4kw-eth0" Jan 27 04:47:48.082292 containerd[1666]: 2026-01-27 04:47:48.068 [INFO][4709] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-qg4kw" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--qg4kw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--qg4kw-eth0", GenerateName:"calico-apiserver-b5b8d765d-", Namespace:"calico-apiserver", SelfLink:"", UID:"e725b346-d7db-48ca-8580-5074b068cd87", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b5b8d765d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7", Pod:"calico-apiserver-b5b8d765d-qg4kw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1a29945bb9d", MAC:"9a:91:fa:94:6a:2e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:48.082292 containerd[1666]: 2026-01-27 04:47:48.079 [INFO][4709] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-qg4kw" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--qg4kw-eth0" Jan 27 04:47:48.091000 audit[4815]: NETFILTER_CFG table=filter:137 family=2 entries=62 op=nft_register_chain pid=4815 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 04:47:48.091000 audit[4815]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=31756 a0=3 a1=ffffcf5fa820 a2=0 a3=ffff9de8afa8 items=0 ppid=4155 pid=4815 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.091000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 04:47:48.108805 containerd[1666]: time="2026-01-27T04:47:48.108767887Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-799789d486-wpdhm,Uid:5edc651f-3273-46b9-a554-3e38c11ea910,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"19a3463973df39c049d86811b616ac5c28284835398bc6182490b6f35f987200\"" Jan 27 04:47:48.110316 containerd[1666]: time="2026-01-27T04:47:48.110285695Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 27 04:47:48.141676 containerd[1666]: time="2026-01-27T04:47:48.141618215Z" level=info msg="connecting to shim 85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7" address="unix:///run/containerd/s/ad8c6b3977878563e42385e3c94fc8cbefb12199aaab5577e8ee928813263f57" namespace=k8s.io protocol=ttrpc version=3 Jan 27 04:47:48.165337 systemd[1]: Started cri-containerd-85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7.scope - libcontainer container 85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7. Jan 27 04:47:48.179000 audit: BPF prog-id=244 op=LOAD Jan 27 04:47:48.179000 audit: BPF prog-id=245 op=LOAD Jan 27 04:47:48.179000 audit[4842]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4830 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835623365393938373161336635616630663566333431313462316630 Jan 27 04:47:48.179000 audit: BPF prog-id=245 op=UNLOAD Jan 27 04:47:48.179000 audit[4842]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4830 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835623365393938373161336635616630663566333431313462316630 Jan 27 04:47:48.179000 audit: BPF prog-id=246 op=LOAD Jan 27 04:47:48.179000 audit[4842]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4830 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835623365393938373161336635616630663566333431313462316630 Jan 27 04:47:48.180000 audit: BPF prog-id=247 op=LOAD Jan 27 04:47:48.180000 audit[4842]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4830 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835623365393938373161336635616630663566333431313462316630 Jan 
27 04:47:48.180000 audit: BPF prog-id=247 op=UNLOAD Jan 27 04:47:48.180000 audit[4842]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4830 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835623365393938373161336635616630663566333431313462316630 Jan 27 04:47:48.180000 audit: BPF prog-id=246 op=UNLOAD Jan 27 04:47:48.180000 audit[4842]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4830 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835623365393938373161336635616630663566333431313462316630 Jan 27 04:47:48.180000 audit: BPF prog-id=248 op=LOAD Jan 27 04:47:48.180000 audit[4842]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4830 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:48.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3835623365393938373161336635616630663566333431313462316630 Jan 27 04:47:48.204105 containerd[1666]: time="2026-01-27T04:47:48.203976133Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b5b8d765d-qg4kw,Uid:e725b346-d7db-48ca-8580-5074b068cd87,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"85b3e99871a3f5af0f5f34114b1f027d128b8cf2944fec35046ce0f7fa893ac7\"" Jan 27 04:47:48.445999 containerd[1666]: time="2026-01-27T04:47:48.445939807Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:47:48.448006 containerd[1666]: time="2026-01-27T04:47:48.447947938Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 27 04:47:48.448085 containerd[1666]: time="2026-01-27T04:47:48.448040418Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 27 04:47:48.448282 kubelet[2905]: E0127 04:47:48.448223 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 04:47:48.448282 kubelet[2905]: E0127 04:47:48.448272 2905 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 04:47:48.448788 kubelet[2905]: E0127 04:47:48.448519 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nmpx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-799789d486-wpdhm_calico-system(5edc651f-3273-46b9-a554-3e38c11ea910): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 27 04:47:48.448922 containerd[1666]: time="2026-01-27T04:47:48.448645621Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 04:47:48.449831 kubelet[2905]: E0127 04:47:48.449767 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:47:48.776572 containerd[1666]: time="2026-01-27T04:47:48.776514774Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:47:48.778579 containerd[1666]: time="2026-01-27T04:47:48.778522624Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 04:47:48.778706 containerd[1666]: time="2026-01-27T04:47:48.778576664Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 04:47:48.778893 kubelet[2905]: E0127 04:47:48.778856 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:47:48.779152 kubelet[2905]: E0127 04:47:48.779058 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:47:48.779410 kubelet[2905]: E0127 04:47:48.779366 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tvs85,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-b5b8d765d-qg4kw_calico-apiserver(e725b346-d7db-48ca-8580-5074b068cd87): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 04:47:48.780867 kubelet[2905]: E0127 04:47:48.780817 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:47:48.819263 systemd-networkd[1495]: cali4ccc93b2085: Gained IPv6LL Jan 27 04:47:48.883395 systemd-networkd[1495]: calib0e9608d1d0: Gained IPv6LL Jan 27 04:47:48.983087 kubelet[2905]: E0127 04:47:48.983044 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:47:48.983881 kubelet[2905]: E0127 04:47:48.983198 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:47:48.983974 kubelet[2905]: E0127 04:47:48.983909 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:47:49.005000 audit[4874]: NETFILTER_CFG table=filter:138 family=2 entries=14 op=nft_register_rule pid=4874 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:49.005000 audit[4874]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffee9ca610 a2=0 a3=1 items=0 ppid=3073 pid=4874 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:49.005000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:49.023000 audit[4874]: NETFILTER_CFG table=nat:139 family=2 entries=56 op=nft_register_chain pid=4874 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:49.023000 audit[4874]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=ffffee9ca610 a2=0 a3=1 items=0 ppid=3073 pid=4874 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:49.023000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:49.395288 systemd-networkd[1495]: cali1a29945bb9d: Gained IPv6LL Jan 27 04:47:49.825978 containerd[1666]: time="2026-01-27T04:47:49.825938488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vk94b,Uid:56e3e6d0-7a6b-4ba3-9081-3231ea811709,Namespace:calico-system,Attempt:0,}" Jan 27 04:47:49.908525 systemd-networkd[1495]: cali8691599140b: Gained IPv6LL Jan 27 04:47:49.925579 systemd-networkd[1495]: calif709bb95935: Link UP Jan 27 04:47:49.926115 systemd-networkd[1495]: calif709bb95935: Gained carrier Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.860 [INFO][4876] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--n--c2731c5fad-k8s-csi--node--driver--vk94b-eth0 csi-node-driver- calico-system 56e3e6d0-7a6b-4ba3-9081-3231ea811709 736 0 2026-01-27 04:47:26 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4592-0-0-n-c2731c5fad csi-node-driver-vk94b eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif709bb95935 [] [] }} ContainerID="dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" Namespace="calico-system" Pod="csi-node-driver-vk94b" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-csi--node--driver--vk94b-" Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.860 [INFO][4876] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" Namespace="calico-system" Pod="csi-node-driver-vk94b" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-csi--node--driver--vk94b-eth0" Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.883 [INFO][4891] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" HandleID="k8s-pod-network.dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" Workload="ci--4592--0--0--n--c2731c5fad-k8s-csi--node--driver--vk94b-eth0" Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.883 [INFO][4891] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" HandleID="k8s-pod-network.dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" Workload="ci--4592--0--0--n--c2731c5fad-k8s-csi--node--driver--vk94b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137b30), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4592-0-0-n-c2731c5fad", "pod":"csi-node-driver-vk94b", "timestamp":"2026-01-27 04:47:49.883767903 +0000 UTC"}, Hostname:"ci-4592-0-0-n-c2731c5fad", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.883 [INFO][4891] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.884 [INFO][4891] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.884 [INFO][4891] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-n-c2731c5fad' Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.894 [INFO][4891] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.898 [INFO][4891] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.903 [INFO][4891] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.904 [INFO][4891] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.907 [INFO][4891] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.907 [INFO][4891] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.909 [INFO][4891] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151 Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.913 [INFO][4891] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.921 [INFO][4891] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.7/26] block=192.168.98.0/26 handle="k8s-pod-network.dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" host="ci-4592-0-0-n-c2731c5fad" 
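The IPAM trace above repeats the same fixed sequence for every pod networked on this node: look up the host's block affinity, load the 192.168.98.0/26 block (64 addresses), claim the next free ordinal in it, and do all of that while holding the host-wide IPAM lock mentioned in the entries, which is why the endpoints come out sequentially as 192.168.98.6, .7 and .8. The snippet below is a minimal, self-contained sketch of that "pick the next free ordinal in a block" step only; nextFreeIP is an illustrative helper name and this is not Calico's actual allocator from libcalico-go.

package main

import (
	"fmt"
	"net/netip"
)

// nextFreeIP returns the first address in the block whose ordinal (offset
// from the network address) is not already in use. It mirrors the
// "Attempting to assign 1 addresses from block" step seen in the log entries
// above, but it is only an illustrative sketch, not Calico's real IPAM code.
func nextFreeIP(block netip.Prefix, used map[int]bool) (netip.Addr, error) {
	size := 1 << (block.Addr().BitLen() - block.Bits()) // 64 addresses in a /26
	addr := block.Masked().Addr()
	for ordinal := 0; ordinal < size; ordinal++ {
		if !used[ordinal] {
			return addr, nil
		}
		addr = addr.Next()
	}
	return netip.Addr{}, fmt.Errorf("block %s is full", block)
}

func main() {
	block := netip.MustParsePrefix("192.168.98.0/26")
	// Ordinals 0-5 already handed out, the state before the
	// calico-apiserver-b5b8d765d-qg4kw endpoint was assigned.
	used := map[int]bool{0: true, 1: true, 2: true, 3: true, 4: true, 5: true}
	ip, err := nextFreeIP(block, used)
	if err != nil {
		panic(err)
	}
	fmt.Println(ip) // 192.168.98.6
}

Serializing this step behind the host-wide IPAM lock, as the "About to acquire" / "Acquired" / "Released" entries show, keeps concurrent CNI ADD calls on the same node from claiming the same ordinal.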
Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.921 [INFO][4891] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.7/26] handle="k8s-pod-network.dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.921 [INFO][4891] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 27 04:47:49.941307 containerd[1666]: 2026-01-27 04:47:49.921 [INFO][4891] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.7/26] IPv6=[] ContainerID="dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" HandleID="k8s-pod-network.dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" Workload="ci--4592--0--0--n--c2731c5fad-k8s-csi--node--driver--vk94b-eth0" Jan 27 04:47:49.941833 containerd[1666]: 2026-01-27 04:47:49.923 [INFO][4876] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" Namespace="calico-system" Pod="csi-node-driver-vk94b" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-csi--node--driver--vk94b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-csi--node--driver--vk94b-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"56e3e6d0-7a6b-4ba3-9081-3231ea811709", ResourceVersion:"736", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"", Pod:"csi-node-driver-vk94b", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif709bb95935", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:49.941833 containerd[1666]: 2026-01-27 04:47:49.923 [INFO][4876] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.7/32] ContainerID="dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" Namespace="calico-system" Pod="csi-node-driver-vk94b" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-csi--node--driver--vk94b-eth0" Jan 27 04:47:49.941833 containerd[1666]: 2026-01-27 04:47:49.923 [INFO][4876] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif709bb95935 ContainerID="dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" Namespace="calico-system" Pod="csi-node-driver-vk94b" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-csi--node--driver--vk94b-eth0" Jan 27 04:47:49.941833 containerd[1666]: 2026-01-27 04:47:49.926 [INFO][4876] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" Namespace="calico-system" Pod="csi-node-driver-vk94b" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-csi--node--driver--vk94b-eth0" Jan 27 04:47:49.941833 containerd[1666]: 2026-01-27 04:47:49.926 [INFO][4876] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" Namespace="calico-system" Pod="csi-node-driver-vk94b" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-csi--node--driver--vk94b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-csi--node--driver--vk94b-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"56e3e6d0-7a6b-4ba3-9081-3231ea811709", ResourceVersion:"736", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151", Pod:"csi-node-driver-vk94b", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif709bb95935", MAC:"22:28:70:83:c4:91", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:49.941833 containerd[1666]: 2026-01-27 04:47:49.938 [INFO][4876] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" Namespace="calico-system" Pod="csi-node-driver-vk94b" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-csi--node--driver--vk94b-eth0" Jan 27 04:47:49.954000 audit[4909]: NETFILTER_CFG table=filter:140 family=2 entries=52 op=nft_register_chain pid=4909 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 04:47:49.954000 audit[4909]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=24312 a0=3 a1=ffffd34e1170 a2=0 a3=ffffb5b22fa8 items=0 ppid=4155 pid=4909 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:49.954000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 04:47:49.988334 containerd[1666]: time="2026-01-27T04:47:49.987253791Z" level=info msg="connecting to shim dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151" 
address="unix:///run/containerd/s/24a55259e7b8edf30bb9dddd7eafc241d6218b0a3b9047dbfd8a2c4da62922af" namespace=k8s.io protocol=ttrpc version=3 Jan 27 04:47:49.988614 kubelet[2905]: E0127 04:47:49.987335 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:47:49.988614 kubelet[2905]: E0127 04:47:49.987547 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:47:50.028535 systemd[1]: Started cri-containerd-dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151.scope - libcontainer container dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151. Jan 27 04:47:50.039000 audit: BPF prog-id=249 op=LOAD Jan 27 04:47:50.040000 audit: BPF prog-id=250 op=LOAD Jan 27 04:47:50.040000 audit[4929]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4918 pid=4929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:50.040000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464363963396361313566613436613363386666316263323233613766 Jan 27 04:47:50.040000 audit: BPF prog-id=250 op=UNLOAD Jan 27 04:47:50.040000 audit[4929]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4918 pid=4929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:50.040000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464363963396361313566613436613363386666316263323233613766 Jan 27 04:47:50.040000 audit: BPF prog-id=251 op=LOAD Jan 27 04:47:50.040000 audit[4929]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4918 pid=4929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:50.040000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464363963396361313566613436613363386666316263323233613766 Jan 27 04:47:50.041000 audit: BPF prog-id=252 op=LOAD Jan 27 04:47:50.041000 audit[4929]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4918 pid=4929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:50.041000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464363963396361313566613436613363386666316263323233613766 Jan 27 04:47:50.041000 audit: BPF prog-id=252 op=UNLOAD Jan 27 04:47:50.041000 audit[4929]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4918 pid=4929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:50.041000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464363963396361313566613436613363386666316263323233613766 Jan 27 04:47:50.041000 audit: BPF prog-id=251 op=UNLOAD Jan 27 04:47:50.041000 audit[4929]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4918 pid=4929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:50.041000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464363963396361313566613436613363386666316263323233613766 Jan 27 04:47:50.041000 audit: BPF prog-id=253 op=LOAD Jan 27 04:47:50.041000 audit[4929]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4918 pid=4929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:50.041000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6464363963396361313566613436613363386666316263323233613766 Jan 27 04:47:50.061409 containerd[1666]: time="2026-01-27T04:47:50.061253288Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vk94b,Uid:56e3e6d0-7a6b-4ba3-9081-3231ea811709,Namespace:calico-system,Attempt:0,} returns sandbox id \"dd69c9ca15fa46a3c8ff1bc223a7fd7f9433a26240455d645b75e8dcb42fc151\"" Jan 27 04:47:50.063848 containerd[1666]: time="2026-01-27T04:47:50.063811821Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 27 04:47:50.397387 containerd[1666]: time="2026-01-27T04:47:50.397319643Z" level=info 
msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:47:50.398976 containerd[1666]: time="2026-01-27T04:47:50.398936371Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 27 04:47:50.399246 containerd[1666]: time="2026-01-27T04:47:50.399009892Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 27 04:47:50.399474 kubelet[2905]: E0127 04:47:50.399149 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 04:47:50.399474 kubelet[2905]: E0127 04:47:50.399190 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 04:47:50.399474 kubelet[2905]: E0127 04:47:50.399297 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsdqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-vk94b_calico-system(56e3e6d0-7a6b-4ba3-9081-3231ea811709): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 27 04:47:50.401269 containerd[1666]: 
time="2026-01-27T04:47:50.401242943Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 27 04:47:50.737704 containerd[1666]: time="2026-01-27T04:47:50.737603859Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:47:50.739046 containerd[1666]: time="2026-01-27T04:47:50.739000466Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 27 04:47:50.739650 containerd[1666]: time="2026-01-27T04:47:50.739083427Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 27 04:47:50.739728 kubelet[2905]: E0127 04:47:50.739277 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 04:47:50.739728 kubelet[2905]: E0127 04:47:50.739319 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 04:47:50.739728 kubelet[2905]: E0127 04:47:50.739435 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsdqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-vk94b_calico-system(56e3e6d0-7a6b-4ba3-9081-3231ea811709): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 27 04:47:50.740832 kubelet[2905]: E0127 04:47:50.740795 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:47:50.825338 containerd[1666]: time="2026-01-27T04:47:50.825293546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b5b8d765d-58pbp,Uid:b46e8e69-6e20-4188-9b8d-4e06490f6e72,Namespace:calico-apiserver,Attempt:0,}" Jan 27 04:47:50.929231 systemd-networkd[1495]: calic207fdd731a: Link UP Jan 27 04:47:50.929496 systemd-networkd[1495]: calic207fdd731a: Gained carrier Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.863 [INFO][4954] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--58pbp-eth0 calico-apiserver-b5b8d765d- calico-apiserver 
b46e8e69-6e20-4188-9b8d-4e06490f6e72 825 0 2026-01-27 04:47:19 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b5b8d765d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4592-0-0-n-c2731c5fad calico-apiserver-b5b8d765d-58pbp eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic207fdd731a [] [] }} ContainerID="bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-58pbp" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--58pbp-" Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.863 [INFO][4954] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-58pbp" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--58pbp-eth0" Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.886 [INFO][4969] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" HandleID="k8s-pod-network.bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" Workload="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--58pbp-eth0" Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.886 [INFO][4969] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" HandleID="k8s-pod-network.bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" Workload="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--58pbp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c31f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4592-0-0-n-c2731c5fad", "pod":"calico-apiserver-b5b8d765d-58pbp", "timestamp":"2026-01-27 04:47:50.886082176 +0000 UTC"}, Hostname:"ci-4592-0-0-n-c2731c5fad", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.886 [INFO][4969] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.886 [INFO][4969] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.886 [INFO][4969] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4592-0-0-n-c2731c5fad' Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.895 [INFO][4969] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.901 [INFO][4969] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.906 [INFO][4969] ipam/ipam.go 511: Trying affinity for 192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.907 [INFO][4969] ipam/ipam.go 158: Attempting to load block cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.910 [INFO][4969] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.98.0/26 host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.910 [INFO][4969] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.98.0/26 handle="k8s-pod-network.bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.911 [INFO][4969] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090 Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.916 [INFO][4969] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.98.0/26 handle="k8s-pod-network.bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.924 [INFO][4969] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.98.8/26] block=192.168.98.0/26 handle="k8s-pod-network.bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.924 [INFO][4969] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.98.8/26] handle="k8s-pod-network.bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" host="ci-4592-0-0-n-c2731c5fad" Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.924 [INFO][4969] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
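The audit PROCTITLE records scattered through these entries (for the iptables-nft-restore, iptables-restore and runc invocations) store the process command line as a hex string with NUL-separated arguments. A small, self-contained sketch for reading them is below: hex-decode the value and replace the NUL bytes with spaces. decodeProctitle is an illustrative helper name, and the sample constant is the proctitle from the NETFILTER_CFG entries above, which decodes to "iptables-nft-restore --noflush --verbose --wait 10 --wait-interval 50000".

package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// decodeProctitle turns an audit PROCTITLE hex string back into the original
// command line. Arguments are separated by NUL bytes in the raw record, so
// they are rejoined with spaces for display.
func decodeProctitle(h string) (string, error) {
	raw, err := hex.DecodeString(h)
	if err != nil {
		return "", err
	}
	return strings.ReplaceAll(string(raw), "\x00", " "), nil
}

func main() {
	// proctitle value copied from the iptables-nft-re NETFILTER_CFG audit
	// entries logged above.
	const p = "69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030"
	cmd, err := decodeProctitle(p)
	if err != nil {
		panic(err)
	}
	fmt.Println(cmd) // iptables-nft-restore --noflush --verbose --wait 10 --wait-interval 50000
}

The same decoding applies to the runc PROCTITLE records that accompany the BPF prog-id LOAD/UNLOAD audit entries when each cri-containerd scope starts.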
Jan 27 04:47:50.943643 containerd[1666]: 2026-01-27 04:47:50.924 [INFO][4969] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.98.8/26] IPv6=[] ContainerID="bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" HandleID="k8s-pod-network.bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" Workload="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--58pbp-eth0" Jan 27 04:47:50.944997 containerd[1666]: 2026-01-27 04:47:50.926 [INFO][4954] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-58pbp" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--58pbp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--58pbp-eth0", GenerateName:"calico-apiserver-b5b8d765d-", Namespace:"calico-apiserver", SelfLink:"", UID:"b46e8e69-6e20-4188-9b8d-4e06490f6e72", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b5b8d765d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"", Pod:"calico-apiserver-b5b8d765d-58pbp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic207fdd731a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:50.944997 containerd[1666]: 2026-01-27 04:47:50.926 [INFO][4954] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.98.8/32] ContainerID="bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-58pbp" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--58pbp-eth0" Jan 27 04:47:50.944997 containerd[1666]: 2026-01-27 04:47:50.926 [INFO][4954] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic207fdd731a ContainerID="bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-58pbp" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--58pbp-eth0" Jan 27 04:47:50.944997 containerd[1666]: 2026-01-27 04:47:50.928 [INFO][4954] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-58pbp" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--58pbp-eth0" Jan 27 04:47:50.944997 containerd[1666]: 2026-01-27 04:47:50.928 [INFO][4954] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-58pbp" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--58pbp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--58pbp-eth0", GenerateName:"calico-apiserver-b5b8d765d-", Namespace:"calico-apiserver", SelfLink:"", UID:"b46e8e69-6e20-4188-9b8d-4e06490f6e72", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2026, time.January, 27, 4, 47, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b5b8d765d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4592-0-0-n-c2731c5fad", ContainerID:"bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090", Pod:"calico-apiserver-b5b8d765d-58pbp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic207fdd731a", MAC:"b2:ea:b9:d6:8e:09", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 27 04:47:50.944997 containerd[1666]: 2026-01-27 04:47:50.940 [INFO][4954] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" Namespace="calico-apiserver" Pod="calico-apiserver-b5b8d765d-58pbp" WorkloadEndpoint="ci--4592--0--0--n--c2731c5fad-k8s-calico--apiserver--b5b8d765d--58pbp-eth0" Jan 27 04:47:50.960000 audit[4985]: NETFILTER_CFG table=filter:141 family=2 entries=57 op=nft_register_chain pid=4985 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 27 04:47:50.960000 audit[4985]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=27812 a0=3 a1=ffffdd9273f0 a2=0 a3=ffff9d1d4fa8 items=0 ppid=4155 pid=4985 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:50.960000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 27 04:47:50.984729 containerd[1666]: time="2026-01-27T04:47:50.984686520Z" level=info msg="connecting to shim bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090" address="unix:///run/containerd/s/fd3fecd631c4801069c9d9c69f752c8a38afe9b6240ae2d0a7746bd6a64bc278" namespace=k8s.io protocol=ttrpc version=3 Jan 27 04:47:50.994388 kubelet[2905]: E0127 04:47:50.994275 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:47:51.011318 systemd[1]: Started cri-containerd-bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090.scope - libcontainer container bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090. Jan 27 04:47:51.020000 audit: BPF prog-id=254 op=LOAD Jan 27 04:47:51.022590 kernel: kauditd_printk_skb: 211 callbacks suppressed Jan 27 04:47:51.022654 kernel: audit: type=1334 audit(1769489271.020:748): prog-id=254 op=LOAD Jan 27 04:47:51.020000 audit: BPF prog-id=255 op=LOAD Jan 27 04:47:51.023443 kernel: audit: type=1334 audit(1769489271.020:749): prog-id=255 op=LOAD Jan 27 04:47:51.020000 audit[5004]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=4994 pid=5004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:51.026571 kernel: audit: type=1300 audit(1769489271.020:749): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=4994 pid=5004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:51.020000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383264646132633833623433633334616236356365313336343333 Jan 27 04:47:51.030235 kernel: audit: type=1327 audit(1769489271.020:749): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383264646132633833623433633334616236356365313336343333 Jan 27 04:47:51.035589 kernel: audit: type=1334 audit(1769489271.021:750): prog-id=255 op=UNLOAD Jan 27 04:47:51.035729 kernel: audit: type=1300 audit(1769489271.021:750): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4994 pid=5004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:51.021000 audit: BPF prog-id=255 op=UNLOAD Jan 27 04:47:51.021000 audit[5004]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4994 pid=5004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:51.021000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383264646132633833623433633334616236356365313336343333 Jan 27 04:47:51.021000 audit: BPF prog-id=256 op=LOAD Jan 27 04:47:51.039614 kernel: audit: type=1327 audit(1769489271.021:750): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383264646132633833623433633334616236356365313336343333 Jan 27 04:47:51.039702 kernel: audit: type=1334 audit(1769489271.021:751): prog-id=256 op=LOAD Jan 27 04:47:51.021000 audit[5004]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=4994 pid=5004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:51.042695 kernel: audit: type=1300 audit(1769489271.021:751): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=4994 pid=5004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:51.021000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383264646132633833623433633334616236356365313336343333 Jan 27 04:47:51.045786 kernel: audit: type=1327 audit(1769489271.021:751): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383264646132633833623433633334616236356365313336343333 Jan 27 04:47:51.022000 audit: BPF prog-id=257 op=LOAD Jan 27 04:47:51.022000 audit[5004]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=4994 pid=5004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:51.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383264646132633833623433633334616236356365313336343333 Jan 27 04:47:51.022000 audit: BPF prog-id=257 op=UNLOAD Jan 27 04:47:51.022000 audit[5004]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4994 pid=5004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:51.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383264646132633833623433633334616236356365313336343333 Jan 27 04:47:51.022000 audit: BPF prog-id=256 op=UNLOAD Jan 27 04:47:51.022000 audit[5004]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4994 pid=5004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:51.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383264646132633833623433633334616236356365313336343333 Jan 27 04:47:51.022000 audit: BPF prog-id=258 op=LOAD Jan 27 04:47:51.022000 audit[5004]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=4994 pid=5004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:51.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383264646132633833623433633334616236356365313336343333 Jan 27 04:47:51.065605 containerd[1666]: time="2026-01-27T04:47:51.065540372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b5b8d765d-58pbp,Uid:b46e8e69-6e20-4188-9b8d-4e06490f6e72,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"bb82dda2c83b43c34ab65ce1364330a9d7674039a20a7ad15b315a6d812a9090\"" Jan 27 04:47:51.067758 containerd[1666]: time="2026-01-27T04:47:51.067726823Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 04:47:51.405054 containerd[1666]: time="2026-01-27T04:47:51.404806823Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:47:51.412139 containerd[1666]: time="2026-01-27T04:47:51.412052660Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 04:47:51.412400 containerd[1666]: time="2026-01-27T04:47:51.412123540Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 04:47:51.412495 kubelet[2905]: E0127 04:47:51.412322 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:47:51.412495 kubelet[2905]: E0127 04:47:51.412369 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:47:51.412666 kubelet[2905]: E0127 04:47:51.412557 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wb7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-b5b8d765d-58pbp_calico-apiserver(b46e8e69-6e20-4188-9b8d-4e06490f6e72): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 04:47:51.414043 kubelet[2905]: E0127 04:47:51.413940 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:47:51.443374 systemd-networkd[1495]: calif709bb95935: Gained IPv6LL Jan 27 04:47:51.994439 kubelet[2905]: E0127 04:47:51.994364 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:47:51.995340 kubelet[2905]: E0127 04:47:51.995299 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:47:52.028000 audit[5033]: NETFILTER_CFG table=filter:142 family=2 entries=14 op=nft_register_rule pid=5033 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:52.028000 audit[5033]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc80bde00 a2=0 a3=1 items=0 ppid=3073 pid=5033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:52.028000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:52.042000 audit[5033]: NETFILTER_CFG table=nat:143 family=2 entries=20 op=nft_register_rule pid=5033 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:47:52.042000 audit[5033]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffc80bde00 a2=0 a3=1 items=0 ppid=3073 pid=5033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:47:52.042000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:47:52.211287 systemd-networkd[1495]: calic207fdd731a: Gained IPv6LL Jan 27 04:47:52.996234 kubelet[2905]: E0127 04:47:52.996191 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:47:56.825839 containerd[1666]: time="2026-01-27T04:47:56.825760320Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 27 04:47:57.165789 containerd[1666]: time="2026-01-27T04:47:57.165659374Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:47:57.167157 containerd[1666]: time="2026-01-27T04:47:57.167121101Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 27 04:47:57.167257 containerd[1666]: time="2026-01-27T04:47:57.167200862Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 27 04:47:57.167370 kubelet[2905]: E0127 04:47:57.167326 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 04:47:57.167641 kubelet[2905]: E0127 04:47:57.167382 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 04:47:57.167641 kubelet[2905]: E0127 04:47:57.167485 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1c7c961a1fbc4629aa41d023389e3c7f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xvk54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68746999d7-v2tpn_calico-system(a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 27 04:47:57.169573 containerd[1666]: time="2026-01-27T04:47:57.169549474Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 27 04:47:57.500544 containerd[1666]: time="2026-01-27T04:47:57.500429122Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:47:57.503436 containerd[1666]: time="2026-01-27T04:47:57.503382217Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 27 04:47:57.503611 containerd[1666]: time="2026-01-27T04:47:57.503473377Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" 
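[editor's note] From here on the log repeats one cycle per Calico image: containerd gets a 404 from ghcr.io, kubelet records ErrImagePull, and subsequent pod syncs surface as ImagePullBackOff while the retry interval grows. A minimal sketch of that capped-exponential-back-off retry pattern, in Python; the delay values are assumptions for illustration, not kubelet's configured parameters:

```python
import time

def pull_with_backoff(pull, initial=10.0, cap=300.0, max_attempts=6):
    """Retry `pull` with capped exponential back-off (illustrative parameters)."""
    delay = initial
    for attempt in range(1, max_attempts + 1):
        try:
            return pull()
        except RuntimeError as err:              # stands in for ErrImagePull
            print(f"attempt {attempt} failed: {err}; backing off {delay:.0f}s")
            time.sleep(delay)
            delay = min(delay * 2, cap)          # double the wait, up to the cap
    raise RuntimeError("still failing after back-off")

def pull_missing_image():
    # Every attempt fails, like ghcr.io/flatcar/calico/apiserver:v3.30.4 above.
    raise RuntimeError("failed to resolve image: not found")

# pull_with_backoff(pull_missing_image)
# prints six failures (sleeping between attempts), then raises.
```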
Jan 27 04:47:57.504115 kubelet[2905]: E0127 04:47:57.503747 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 04:47:57.504590 kubelet[2905]: E0127 04:47:57.504235 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 04:47:57.504590 kubelet[2905]: E0127 04:47:57.504371 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xvk54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68746999d7-v2tpn_calico-system(a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 27 04:47:57.505600 kubelet[2905]: E0127 04:47:57.505551 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:48:00.826802 containerd[1666]: time="2026-01-27T04:48:00.826631412Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 04:48:01.167633 containerd[1666]: time="2026-01-27T04:48:01.167409030Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:48:01.168858 containerd[1666]: time="2026-01-27T04:48:01.168801237Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 04:48:01.169067 containerd[1666]: time="2026-01-27T04:48:01.168840477Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 04:48:01.169118 kubelet[2905]: E0127 04:48:01.169031 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:48:01.169118 kubelet[2905]: E0127 04:48:01.169078 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:48:01.169450 kubelet[2905]: E0127 04:48:01.169209 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tvs85,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-b5b8d765d-qg4kw_calico-apiserver(e725b346-d7db-48ca-8580-5074b068cd87): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 04:48:01.170429 kubelet[2905]: E0127 04:48:01.170373 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:48:03.826772 containerd[1666]: time="2026-01-27T04:48:03.826733118Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 27 04:48:04.161304 containerd[1666]: time="2026-01-27T04:48:04.160990663Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:48:04.164812 containerd[1666]: time="2026-01-27T04:48:04.164765842Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 27 04:48:04.164927 containerd[1666]: time="2026-01-27T04:48:04.164857283Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 27 04:48:04.165147 kubelet[2905]: E0127 04:48:04.165044 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 04:48:04.165147 kubelet[2905]: E0127 04:48:04.165111 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 04:48:04.165558 kubelet[2905]: E0127 04:48:04.165249 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zbn74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-trbp2_calico-system(1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 27 04:48:04.166481 kubelet[2905]: E0127 04:48:04.166438 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:48:04.826216 containerd[1666]: time="2026-01-27T04:48:04.826149696Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 27 04:48:05.200293 containerd[1666]: time="2026-01-27T04:48:05.200132644Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:48:05.209348 containerd[1666]: time="2026-01-27T04:48:05.209216691Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 27 04:48:05.209683 containerd[1666]: time="2026-01-27T04:48:05.209249811Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 27 04:48:05.209828 kubelet[2905]: E0127 04:48:05.209772 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 04:48:05.210301 kubelet[2905]: E0127 04:48:05.209918 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 04:48:05.211337 kubelet[2905]: E0127 04:48:05.211287 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nmpx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-799789d486-wpdhm_calico-system(5edc651f-3273-46b9-a554-3e38c11ea910): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 27 04:48:05.212972 kubelet[2905]: E0127 04:48:05.212937 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:48:06.826713 containerd[1666]: time="2026-01-27T04:48:06.826604502Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 27 04:48:07.153233 containerd[1666]: time="2026-01-27T04:48:07.153063568Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:48:07.155360 containerd[1666]: time="2026-01-27T04:48:07.155266099Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 27 04:48:07.155541 containerd[1666]: time="2026-01-27T04:48:07.155354700Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 27 04:48:07.155887 kubelet[2905]: E0127 04:48:07.155820 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 04:48:07.156460 kubelet[2905]: E0127 04:48:07.156296 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 04:48:07.156785 kubelet[2905]: E0127 04:48:07.156540 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsdqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-vk94b_calico-system(56e3e6d0-7a6b-4ba3-9081-3231ea811709): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 27 04:48:07.156923 containerd[1666]: time="2026-01-27T04:48:07.156750187Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 04:48:07.480849 containerd[1666]: time="2026-01-27T04:48:07.480745800Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:48:07.483644 containerd[1666]: time="2026-01-27T04:48:07.483550454Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 04:48:07.483644 containerd[1666]: time="2026-01-27T04:48:07.483594934Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 04:48:07.483797 kubelet[2905]: E0127 04:48:07.483753 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:48:07.483873 kubelet[2905]: E0127 04:48:07.483802 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:48:07.484052 kubelet[2905]: E0127 04:48:07.484003 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wb7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-b5b8d765d-58pbp_calico-apiserver(b46e8e69-6e20-4188-9b8d-4e06490f6e72): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 04:48:07.484524 containerd[1666]: time="2026-01-27T04:48:07.484209177Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 27 04:48:07.485353 kubelet[2905]: E0127 04:48:07.485321 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:48:07.813926 containerd[1666]: time="2026-01-27T04:48:07.813792859Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:48:07.815881 containerd[1666]: time="2026-01-27T04:48:07.815832389Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 27 04:48:07.815960 containerd[1666]: time="2026-01-27T04:48:07.815917830Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 27 04:48:07.816117 kubelet[2905]: E0127 04:48:07.816069 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 04:48:07.816174 kubelet[2905]: E0127 04:48:07.816125 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 04:48:07.816287 kubelet[2905]: E0127 04:48:07.816238 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsdqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-vk94b_calico-system(56e3e6d0-7a6b-4ba3-9081-3231ea811709): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 27 04:48:07.817581 kubelet[2905]: E0127 04:48:07.817528 2905 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:48:12.826497 kubelet[2905]: E0127 04:48:12.826434 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:48:14.826171 kubelet[2905]: E0127 04:48:14.825658 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:48:19.828119 kubelet[2905]: E0127 04:48:19.827501 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:48:19.828119 kubelet[2905]: E0127 04:48:19.827611 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:48:19.830710 kubelet[2905]: E0127 04:48:19.828710 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to 
\"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:48:21.825733 kubelet[2905]: E0127 04:48:21.825656 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:48:25.830680 containerd[1666]: time="2026-01-27T04:48:25.830414737Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 04:48:26.175600 containerd[1666]: time="2026-01-27T04:48:26.175356017Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:48:26.178054 containerd[1666]: time="2026-01-27T04:48:26.177946110Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 04:48:26.178163 containerd[1666]: time="2026-01-27T04:48:26.178043950Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 04:48:26.178518 kubelet[2905]: E0127 04:48:26.178294 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:48:26.178518 kubelet[2905]: E0127 04:48:26.178343 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:48:26.178518 kubelet[2905]: E0127 04:48:26.178464 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tvs85,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-b5b8d765d-qg4kw_calico-apiserver(e725b346-d7db-48ca-8580-5074b068cd87): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 04:48:26.179787 kubelet[2905]: E0127 04:48:26.179742 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:48:26.826117 containerd[1666]: time="2026-01-27T04:48:26.826010256Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 27 04:48:27.189544 containerd[1666]: time="2026-01-27T04:48:27.189382870Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:48:27.190978 containerd[1666]: time="2026-01-27T04:48:27.190938318Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 27 04:48:27.191058 containerd[1666]: time="2026-01-27T04:48:27.191021158Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 27 04:48:27.191272 kubelet[2905]: E0127 
04:48:27.191227 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 04:48:27.191687 kubelet[2905]: E0127 04:48:27.191282 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 04:48:27.191687 kubelet[2905]: E0127 04:48:27.191380 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1c7c961a1fbc4629aa41d023389e3c7f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xvk54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68746999d7-v2tpn_calico-system(a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 27 04:48:27.193263 containerd[1666]: time="2026-01-27T04:48:27.193237050Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 27 04:48:27.537113 containerd[1666]: time="2026-01-27T04:48:27.536076319Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:48:27.543901 containerd[1666]: time="2026-01-27T04:48:27.543844958Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 27 04:48:27.544044 containerd[1666]: time="2026-01-27T04:48:27.543897919Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 27 04:48:27.544233 kubelet[2905]: E0127 04:48:27.544188 2905 log.go:32] "PullImage from image service failed" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 04:48:27.544492 kubelet[2905]: E0127 04:48:27.544317 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 04:48:27.544492 kubelet[2905]: E0127 04:48:27.544436 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xvk54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68746999d7-v2tpn_calico-system(a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 27 04:48:27.545675 kubelet[2905]: E0127 04:48:27.545586 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: 
not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:48:31.829167 containerd[1666]: time="2026-01-27T04:48:31.828923780Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 27 04:48:32.169836 containerd[1666]: time="2026-01-27T04:48:32.169536878Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:48:32.172321 containerd[1666]: time="2026-01-27T04:48:32.172280452Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 27 04:48:32.172411 containerd[1666]: time="2026-01-27T04:48:32.172361692Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 27 04:48:32.172601 kubelet[2905]: E0127 04:48:32.172542 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 04:48:32.172935 kubelet[2905]: E0127 04:48:32.172607 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 04:48:32.172935 kubelet[2905]: E0127 04:48:32.172816 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsdqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in 
pod csi-node-driver-vk94b_calico-system(56e3e6d0-7a6b-4ba3-9081-3231ea811709): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 27 04:48:32.173063 containerd[1666]: time="2026-01-27T04:48:32.172941615Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 27 04:48:32.493359 containerd[1666]: time="2026-01-27T04:48:32.493180329Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:48:32.497226 containerd[1666]: time="2026-01-27T04:48:32.497066309Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 27 04:48:32.497226 containerd[1666]: time="2026-01-27T04:48:32.497112949Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 27 04:48:32.497699 kubelet[2905]: E0127 04:48:32.497499 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 04:48:32.497699 kubelet[2905]: E0127 04:48:32.497550 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 04:48:32.497834 kubelet[2905]: E0127 04:48:32.497769 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nmpx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-799789d486-wpdhm_calico-system(5edc651f-3273-46b9-a554-3e38c11ea910): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 27 04:48:32.499353 containerd[1666]: time="2026-01-27T04:48:32.499317480Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 27 04:48:32.499646 kubelet[2905]: E0127 04:48:32.499611 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:48:32.823057 containerd[1666]: time="2026-01-27T04:48:32.822332168Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:48:32.828054 containerd[1666]: time="2026-01-27T04:48:32.827851877Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 27 04:48:32.828054 containerd[1666]: time="2026-01-27T04:48:32.827904997Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 27 04:48:32.828207 kubelet[2905]: E0127 04:48:32.828072 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 04:48:32.828207 kubelet[2905]: E0127 04:48:32.828131 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 04:48:32.828414 kubelet[2905]: E0127 04:48:32.828323 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsdqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-vk94b_calico-system(56e3e6d0-7a6b-4ba3-9081-3231ea811709): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 27 04:48:32.828414 kubelet[2905]: E0127 04:48:32.829462 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:48:32.830035 containerd[1666]: time="2026-01-27T04:48:32.828390519Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 04:48:33.164438 containerd[1666]: time="2026-01-27T04:48:33.164007272Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:48:33.169270 
containerd[1666]: time="2026-01-27T04:48:33.169160738Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 04:48:33.169270 containerd[1666]: time="2026-01-27T04:48:33.169215898Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 04:48:33.169421 kubelet[2905]: E0127 04:48:33.169376 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:48:33.169466 kubelet[2905]: E0127 04:48:33.169425 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:48:33.170122 kubelet[2905]: E0127 04:48:33.169547 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wb7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-b5b8d765d-58pbp_calico-apiserver(b46e8e69-6e20-4188-9b8d-4e06490f6e72): ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 04:48:33.171041 kubelet[2905]: E0127 04:48:33.170989 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:48:34.825961 containerd[1666]: time="2026-01-27T04:48:34.825922870Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 27 04:48:35.160299 containerd[1666]: time="2026-01-27T04:48:35.160182896Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:48:35.161962 containerd[1666]: time="2026-01-27T04:48:35.161921985Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 27 04:48:35.162047 containerd[1666]: time="2026-01-27T04:48:35.161966465Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 27 04:48:35.162222 kubelet[2905]: E0127 04:48:35.162177 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 04:48:35.162494 kubelet[2905]: E0127 04:48:35.162235 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 04:48:35.162494 kubelet[2905]: E0127 04:48:35.162402 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zbn74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-trbp2_calico-system(1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 27 04:48:35.163882 kubelet[2905]: E0127 04:48:35.163792 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:48:40.826702 kubelet[2905]: E0127 04:48:40.826258 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:48:41.827138 kubelet[2905]: E0127 04:48:41.825740 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:48:45.826146 kubelet[2905]: E0127 04:48:45.825810 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:48:45.829765 kubelet[2905]: E0127 04:48:45.829710 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:48:46.825906 kubelet[2905]: E0127 04:48:46.825853 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:48:46.826161 kubelet[2905]: E0127 04:48:46.825890 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:48:51.827485 kubelet[2905]: E0127 04:48:51.827430 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:48:53.832868 kubelet[2905]: E0127 04:48:53.832633 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:48:59.827470 kubelet[2905]: E0127 04:48:59.827416 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:48:59.828761 kubelet[2905]: E0127 04:48:59.827790 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:49:00.825625 kubelet[2905]: E0127 04:49:00.825564 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc 
error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:49:01.828615 kubelet[2905]: E0127 04:49:01.828369 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:49:02.825488 kubelet[2905]: E0127 04:49:02.825379 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:49:05.828123 kubelet[2905]: E0127 04:49:05.828049 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:49:10.826203 kubelet[2905]: E0127 04:49:10.825290 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:49:12.826590 kubelet[2905]: E0127 04:49:12.826539 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:49:12.827033 containerd[1666]: time="2026-01-27T04:49:12.826751545Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 27 04:49:13.197853 containerd[1666]: time="2026-01-27T04:49:13.197801398Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:49:13.202828 containerd[1666]: time="2026-01-27T04:49:13.202754783Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 27 04:49:13.202917 containerd[1666]: time="2026-01-27T04:49:13.202871824Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 27 04:49:13.203039 kubelet[2905]: E0127 04:49:13.202960 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 04:49:13.203088 kubelet[2905]: E0127 04:49:13.203043 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 04:49:13.203749 kubelet[2905]: E0127 04:49:13.203684 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nmpx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-799789d486-wpdhm_calico-system(5edc651f-3273-46b9-a554-3e38c11ea910): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 27 04:49:13.204938 kubelet[2905]: E0127 04:49:13.204872 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:49:14.825973 containerd[1666]: time="2026-01-27T04:49:14.825812423Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 04:49:15.164535 containerd[1666]: time="2026-01-27T04:49:15.164388871Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:49:15.169692 containerd[1666]: time="2026-01-27T04:49:15.169594217Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 04:49:15.169781 containerd[1666]: time="2026-01-27T04:49:15.169665618Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 04:49:15.169817 kubelet[2905]: E0127 04:49:15.169751 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:49:15.170102 kubelet[2905]: E0127 04:49:15.169814 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:49:15.170102 
kubelet[2905]: E0127 04:49:15.169932 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tvs85,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-b5b8d765d-qg4kw_calico-apiserver(e725b346-d7db-48ca-8580-5074b068cd87): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 04:49:15.171131 kubelet[2905]: E0127 04:49:15.171103 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:49:15.826295 containerd[1666]: time="2026-01-27T04:49:15.826254488Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 27 04:49:16.153627 containerd[1666]: time="2026-01-27T04:49:16.153509597Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:49:16.155801 containerd[1666]: time="2026-01-27T04:49:16.155748729Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 27 04:49:16.155977 containerd[1666]: time="2026-01-27T04:49:16.155815089Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 27 04:49:16.156057 kubelet[2905]: E0127 04:49:16.155997 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 04:49:16.156121 kubelet[2905]: E0127 04:49:16.156069 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 04:49:16.156238 kubelet[2905]: E0127 04:49:16.156199 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsdqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-vk94b_calico-system(56e3e6d0-7a6b-4ba3-9081-3231ea811709): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 27 04:49:16.158112 containerd[1666]: time="2026-01-27T04:49:16.158063780Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 27 04:49:16.484181 containerd[1666]: time="2026-01-27T04:49:16.483854243Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:49:16.486415 containerd[1666]: time="2026-01-27T04:49:16.486349175Z" 
level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 27 04:49:16.486479 containerd[1666]: time="2026-01-27T04:49:16.486435416Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 27 04:49:16.486607 kubelet[2905]: E0127 04:49:16.486569 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 04:49:16.486882 kubelet[2905]: E0127 04:49:16.486618 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 04:49:16.486882 kubelet[2905]: E0127 04:49:16.486726 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsdqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-vk94b_calico-system(56e3e6d0-7a6b-4ba3-9081-3231ea811709): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 27 04:49:16.487967 kubelet[2905]: E0127 04:49:16.487893 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:49:19.826164 containerd[1666]: time="2026-01-27T04:49:19.826039454Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 27 04:49:20.162376 containerd[1666]: time="2026-01-27T04:49:20.162242729Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:49:20.163762 containerd[1666]: time="2026-01-27T04:49:20.163722337Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 27 04:49:20.163918 containerd[1666]: time="2026-01-27T04:49:20.163802817Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 27 04:49:20.163999 kubelet[2905]: E0127 04:49:20.163940 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 04:49:20.164398 kubelet[2905]: E0127 04:49:20.164009 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 04:49:20.164398 kubelet[2905]: E0127 04:49:20.164133 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1c7c961a1fbc4629aa41d023389e3c7f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xvk54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68746999d7-v2tpn_calico-system(a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 27 04:49:20.166651 containerd[1666]: time="2026-01-27T04:49:20.166623871Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 27 04:49:20.501940 containerd[1666]: time="2026-01-27T04:49:20.501864502Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:49:20.504998 containerd[1666]: time="2026-01-27T04:49:20.504948998Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 27 04:49:20.505135 containerd[1666]: time="2026-01-27T04:49:20.505049598Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 27 04:49:20.505249 kubelet[2905]: E0127 04:49:20.505210 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 04:49:20.505409 kubelet[2905]: E0127 04:49:20.505379 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 04:49:20.505552 kubelet[2905]: E0127 04:49:20.505513 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xvk54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68746999d7-v2tpn_calico-system(a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 27 04:49:20.506975 kubelet[2905]: E0127 04:49:20.506907 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:49:21.827297 containerd[1666]: time="2026-01-27T04:49:21.827245264Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 04:49:22.332341 containerd[1666]: time="2026-01-27T04:49:22.332291040Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:49:22.336652 containerd[1666]: time="2026-01-27T04:49:22.336602422Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 
27 04:49:22.336739 containerd[1666]: time="2026-01-27T04:49:22.336682463Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 04:49:22.336898 kubelet[2905]: E0127 04:49:22.336850 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:49:22.337197 kubelet[2905]: E0127 04:49:22.336915 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:49:22.337197 kubelet[2905]: E0127 04:49:22.337033 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wb7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-b5b8d765d-58pbp_calico-apiserver(b46e8e69-6e20-4188-9b8d-4e06490f6e72): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 04:49:22.338725 kubelet[2905]: E0127 04:49:22.338688 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with 
ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:49:23.826240 containerd[1666]: time="2026-01-27T04:49:23.826196182Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 27 04:49:24.161843 containerd[1666]: time="2026-01-27T04:49:24.161385212Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:49:24.163892 containerd[1666]: time="2026-01-27T04:49:24.163783584Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 27 04:49:24.164223 containerd[1666]: time="2026-01-27T04:49:24.163877105Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 27 04:49:24.164258 kubelet[2905]: E0127 04:49:24.163994 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 04:49:24.164258 kubelet[2905]: E0127 04:49:24.164048 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 04:49:24.164970 kubelet[2905]: E0127 04:49:24.164669 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zbn74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-trbp2_calico-system(1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 27 04:49:24.166737 kubelet[2905]: E0127 04:49:24.166029 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:49:26.827240 kubelet[2905]: E0127 04:49:26.826687 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:49:27.829523 kubelet[2905]: E0127 04:49:27.829472 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:49:27.830761 kubelet[2905]: E0127 04:49:27.829635 2905 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:49:28.183210 kernel: kauditd_printk_skb: 18 callbacks suppressed Jan 27 04:49:28.183322 kernel: audit: type=1130 audit(1769489368.181:758): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.3.32:22-4.153.228.146:45840 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:49:28.181000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.3.32:22-4.153.228.146:45840 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:49:28.182271 systemd[1]: Started sshd@9-10.0.3.32:22-4.153.228.146:45840.service - OpenSSH per-connection server daemon (4.153.228.146:45840). Jan 27 04:49:28.719000 audit[5198]: USER_ACCT pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:28.724157 kernel: audit: type=1101 audit(1769489368.719:759): pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:28.724230 sshd[5198]: Accepted publickey for core from 4.153.228.146 port 45840 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:49:28.726123 sshd-session[5198]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:49:28.724000 audit[5198]: CRED_ACQ pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:28.730816 kernel: audit: type=1103 audit(1769489368.724:760): pid=5198 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:28.731149 kernel: audit: type=1006 audit(1769489368.724:761): pid=5198 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 27 04:49:28.731176 kernel: audit: type=1300 audit(1769489368.724:761): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc5ee4fe0 a2=3 a3=0 items=0 ppid=1 pid=5198 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:49:28.724000 audit[5198]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc5ee4fe0 a2=3 a3=0 items=0 ppid=1 pid=5198 auid=500 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:49:28.724000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:49:28.735383 kernel: audit: type=1327 audit(1769489368.724:761): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:49:28.738614 systemd-logind[1653]: New session 11 of user core. Jan 27 04:49:28.748328 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 27 04:49:28.750000 audit[5198]: USER_START pid=5198 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:28.752000 audit[5202]: CRED_ACQ pid=5202 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:28.758245 kernel: audit: type=1105 audit(1769489368.750:762): pid=5198 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:28.758368 kernel: audit: type=1103 audit(1769489368.752:763): pid=5202 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:29.092231 sshd[5202]: Connection closed by 4.153.228.146 port 45840 Jan 27 04:49:29.092725 sshd-session[5198]: pam_unix(sshd:session): session closed for user core Jan 27 04:49:29.093000 audit[5198]: USER_END pid=5198 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:29.097271 systemd[1]: sshd@9-10.0.3.32:22-4.153.228.146:45840.service: Deactivated successfully. 
Jan 27 04:49:29.093000 audit[5198]: CRED_DISP pid=5198 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:29.101120 kernel: audit: type=1106 audit(1769489369.093:764): pid=5198 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:29.101187 kernel: audit: type=1104 audit(1769489369.093:765): pid=5198 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:29.099638 systemd[1]: session-11.scope: Deactivated successfully. Jan 27 04:49:29.101792 systemd-logind[1653]: Session 11 logged out. Waiting for processes to exit. Jan 27 04:49:29.096000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.3.32:22-4.153.228.146:45840 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:49:29.103016 systemd-logind[1653]: Removed session 11. Jan 27 04:49:33.827424 kubelet[2905]: E0127 04:49:33.827365 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:49:34.202966 systemd[1]: Started sshd@10-10.0.3.32:22-4.153.228.146:45852.service - OpenSSH per-connection server daemon (4.153.228.146:45852). Jan 27 04:49:34.203327 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 04:49:34.203376 kernel: audit: type=1130 audit(1769489374.201:767): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.3.32:22-4.153.228.146:45852 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:49:34.201000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.3.32:22-4.153.228.146:45852 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:49:34.749000 audit[5218]: USER_ACCT pid=5218 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:34.752309 sshd[5218]: Accepted publickey for core from 4.153.228.146 port 45852 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:49:34.752000 audit[5218]: CRED_ACQ pid=5218 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:34.754322 sshd-session[5218]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:49:34.756146 kernel: audit: type=1101 audit(1769489374.749:768): pid=5218 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:34.756247 kernel: audit: type=1103 audit(1769489374.752:769): pid=5218 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:34.756270 kernel: audit: type=1006 audit(1769489374.752:770): pid=5218 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Jan 27 04:49:34.752000 audit[5218]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffa0fc030 a2=3 a3=0 items=0 ppid=1 pid=5218 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:49:34.760448 systemd-logind[1653]: New session 12 of user core. Jan 27 04:49:34.760982 kernel: audit: type=1300 audit(1769489374.752:770): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffa0fc030 a2=3 a3=0 items=0 ppid=1 pid=5218 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:49:34.752000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:49:34.762215 kernel: audit: type=1327 audit(1769489374.752:770): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:49:34.767436 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jan 27 04:49:34.769000 audit[5218]: USER_START pid=5218 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:34.773000 audit[5222]: CRED_ACQ pid=5222 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:34.777914 kernel: audit: type=1105 audit(1769489374.769:771): pid=5218 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:34.778013 kernel: audit: type=1103 audit(1769489374.773:772): pid=5222 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:35.117985 sshd[5222]: Connection closed by 4.153.228.146 port 45852 Jan 27 04:49:35.118000 audit[5218]: USER_END pid=5218 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:35.117908 sshd-session[5218]: pam_unix(sshd:session): session closed for user core Jan 27 04:49:35.123376 systemd[1]: sshd@10-10.0.3.32:22-4.153.228.146:45852.service: Deactivated successfully. Jan 27 04:49:35.118000 audit[5218]: CRED_DISP pid=5218 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:35.126771 kernel: audit: type=1106 audit(1769489375.118:773): pid=5218 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:35.126870 kernel: audit: type=1104 audit(1769489375.118:774): pid=5218 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:35.128467 systemd[1]: session-12.scope: Deactivated successfully. Jan 27 04:49:35.131344 systemd-logind[1653]: Session 12 logged out. Waiting for processes to exit. Jan 27 04:49:35.125000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.3.32:22-4.153.228.146:45852 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:49:35.134477 systemd-logind[1653]: Removed session 12. 
Jan 27 04:49:35.828408 kubelet[2905]: E0127 04:49:35.828057 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:49:37.828594 kubelet[2905]: E0127 04:49:37.828441 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:49:38.826193 kubelet[2905]: E0127 04:49:38.826149 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:49:39.826427 kubelet[2905]: E0127 04:49:39.826289 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:49:40.226904 systemd[1]: Started sshd@11-10.0.3.32:22-4.153.228.146:46598.service - OpenSSH per-connection server daemon (4.153.228.146:46598). Jan 27 04:49:40.227711 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 04:49:40.227868 kernel: audit: type=1130 audit(1769489380.225:776): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.3.32:22-4.153.228.146:46598 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:49:40.225000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.3.32:22-4.153.228.146:46598 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:49:40.755000 audit[5236]: USER_ACCT pid=5236 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:40.757138 sshd[5236]: Accepted publickey for core from 4.153.228.146 port 46598 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:49:40.759275 sshd-session[5236]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:49:40.757000 audit[5236]: CRED_ACQ pid=5236 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:40.762860 kernel: audit: type=1101 audit(1769489380.755:777): pid=5236 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:40.762950 kernel: audit: type=1103 audit(1769489380.757:778): pid=5236 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:40.762972 kernel: audit: type=1006 audit(1769489380.757:779): pid=5236 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Jan 27 04:49:40.757000 audit[5236]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff4147740 a2=3 a3=0 items=0 ppid=1 pid=5236 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:49:40.766887 systemd-logind[1653]: New session 13 of user core. Jan 27 04:49:40.767678 kernel: audit: type=1300 audit(1769489380.757:779): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff4147740 a2=3 a3=0 items=0 ppid=1 pid=5236 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:49:40.757000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:49:40.768860 kernel: audit: type=1327 audit(1769489380.757:779): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:49:40.772297 systemd[1]: Started session-13.scope - Session 13 of User core. 
Jan 27 04:49:40.773000 audit[5236]: USER_START pid=5236 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:40.775000 audit[5240]: CRED_ACQ pid=5240 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:40.780821 kernel: audit: type=1105 audit(1769489380.773:780): pid=5236 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:40.780891 kernel: audit: type=1103 audit(1769489380.775:781): pid=5240 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:40.827411 kubelet[2905]: E0127 04:49:40.827339 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:49:41.122627 sshd[5240]: Connection closed by 4.153.228.146 port 46598 Jan 27 04:49:41.121539 sshd-session[5236]: pam_unix(sshd:session): session closed for user core Jan 27 04:49:41.121000 audit[5236]: USER_END pid=5236 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:41.126255 systemd[1]: sshd@11-10.0.3.32:22-4.153.228.146:46598.service: Deactivated successfully. Jan 27 04:49:41.121000 audit[5236]: CRED_DISP pid=5236 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:41.128793 systemd[1]: session-13.scope: Deactivated successfully. 
Jan 27 04:49:41.129022 kernel: audit: type=1106 audit(1769489381.121:782): pid=5236 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:41.129066 kernel: audit: type=1104 audit(1769489381.121:783): pid=5236 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:41.126000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.3.32:22-4.153.228.146:46598 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:49:41.130185 systemd-logind[1653]: Session 13 logged out. Waiting for processes to exit. Jan 27 04:49:41.130961 systemd-logind[1653]: Removed session 13. Jan 27 04:49:41.237000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.3.32:22-4.153.228.146:46606 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:49:41.238776 systemd[1]: Started sshd@12-10.0.3.32:22-4.153.228.146:46606.service - OpenSSH per-connection server daemon (4.153.228.146:46606). Jan 27 04:49:41.778000 audit[5256]: USER_ACCT pid=5256 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:41.780050 sshd[5256]: Accepted publickey for core from 4.153.228.146 port 46606 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:49:41.780000 audit[5256]: CRED_ACQ pid=5256 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:41.780000 audit[5256]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc5041d00 a2=3 a3=0 items=0 ppid=1 pid=5256 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:49:41.780000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:49:41.782439 sshd-session[5256]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:49:41.787400 systemd-logind[1653]: New session 14 of user core. Jan 27 04:49:41.797296 systemd[1]: Started session-14.scope - Session 14 of User core. 
Jan 27 04:49:41.798000 audit[5256]: USER_START pid=5256 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:41.800000 audit[5260]: CRED_ACQ pid=5260 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:42.172748 sshd[5260]: Connection closed by 4.153.228.146 port 46606 Jan 27 04:49:42.173013 sshd-session[5256]: pam_unix(sshd:session): session closed for user core Jan 27 04:49:42.173000 audit[5256]: USER_END pid=5256 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:42.174000 audit[5256]: CRED_DISP pid=5256 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:42.177826 systemd[1]: sshd@12-10.0.3.32:22-4.153.228.146:46606.service: Deactivated successfully. Jan 27 04:49:42.177000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.3.32:22-4.153.228.146:46606 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:49:42.179673 systemd[1]: session-14.scope: Deactivated successfully. Jan 27 04:49:42.181645 systemd-logind[1653]: Session 14 logged out. Waiting for processes to exit. Jan 27 04:49:42.183544 systemd-logind[1653]: Removed session 14. Jan 27 04:49:42.277322 systemd[1]: Started sshd@13-10.0.3.32:22-4.153.228.146:46612.service - OpenSSH per-connection server daemon (4.153.228.146:46612). Jan 27 04:49:42.276000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.3.32:22-4.153.228.146:46612 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:49:42.799000 audit[5271]: USER_ACCT pid=5271 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:42.800326 sshd[5271]: Accepted publickey for core from 4.153.228.146 port 46612 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:49:42.800000 audit[5271]: CRED_ACQ pid=5271 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:42.800000 audit[5271]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdfe36100 a2=3 a3=0 items=0 ppid=1 pid=5271 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:49:42.800000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:49:42.801924 sshd-session[5271]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:49:42.805639 systemd-logind[1653]: New session 15 of user core. Jan 27 04:49:42.812327 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 27 04:49:42.814000 audit[5271]: USER_START pid=5271 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:42.815000 audit[5275]: CRED_ACQ pid=5275 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:43.155342 sshd[5275]: Connection closed by 4.153.228.146 port 46612 Jan 27 04:49:43.155553 sshd-session[5271]: pam_unix(sshd:session): session closed for user core Jan 27 04:49:43.157000 audit[5271]: USER_END pid=5271 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:43.157000 audit[5271]: CRED_DISP pid=5271 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:43.162511 systemd[1]: sshd@13-10.0.3.32:22-4.153.228.146:46612.service: Deactivated successfully. Jan 27 04:49:43.161000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.3.32:22-4.153.228.146:46612 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:49:43.165923 systemd[1]: session-15.scope: Deactivated successfully. Jan 27 04:49:43.166758 systemd-logind[1653]: Session 15 logged out. Waiting for processes to exit. 
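The proctitle=7373... payloads in the PROCTITLE records above are hex-encoded process titles with NUL-separated arguments. A minimal decoding sketch (the function name is illustrative, not part of any audit tooling):

    def decode_proctitle(hex_value: str) -> str:
        # the audit subsystem stores the command line as hex with NUL-separated arguments
        return bytes.fromhex(hex_value).replace(b"\x00", b" ").decode()

    print(decode_proctitle("737368642D73657373696F6E3A20636F7265205B707269765D"))
    # -> sshd-session: core [priv]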
Jan 27 04:49:43.168589 systemd-logind[1653]: Removed session 15. Jan 27 04:49:45.831411 kubelet[2905]: E0127 04:49:45.831326 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:49:48.263906 systemd[1]: Started sshd@14-10.0.3.32:22-4.153.228.146:36720.service - OpenSSH per-connection server daemon (4.153.228.146:36720). Jan 27 04:49:48.263000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.3.32:22-4.153.228.146:36720 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:49:48.267205 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 27 04:49:48.267282 kernel: audit: type=1130 audit(1769489388.263:803): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.3.32:22-4.153.228.146:36720 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:49:48.802813 sshd[5318]: Accepted publickey for core from 4.153.228.146 port 36720 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:49:48.801000 audit[5318]: USER_ACCT pid=5318 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:48.806635 sshd-session[5318]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:49:48.804000 audit[5318]: CRED_ACQ pid=5318 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:48.809468 kernel: audit: type=1101 audit(1769489388.801:804): pid=5318 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:48.809528 kernel: audit: type=1103 audit(1769489388.804:805): pid=5318 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:48.811412 kernel: audit: type=1006 audit(1769489388.804:806): pid=5318 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Jan 27 
04:49:48.811529 kernel: audit: type=1300 audit(1769489388.804:806): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff10ced30 a2=3 a3=0 items=0 ppid=1 pid=5318 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:49:48.804000 audit[5318]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff10ced30 a2=3 a3=0 items=0 ppid=1 pid=5318 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:49:48.804000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:49:48.816143 kernel: audit: type=1327 audit(1769489388.804:806): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:49:48.818441 systemd-logind[1653]: New session 16 of user core. Jan 27 04:49:48.826258 kubelet[2905]: E0127 04:49:48.826208 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:49:48.826596 kubelet[2905]: E0127 04:49:48.826305 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:49:48.833391 systemd[1]: Started session-16.scope - Session 16 of User core. 
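The kubelet "Error syncing pod" records above pack the containerd ErrImagePull details into one escaped err string. A minimal sketch that extracts the failing image references; the sample string is a shortened copy of the log text and the regex is only illustrative:

    import re

    err = ('failed to "StartContainer" for "goldmane" with ImagePullBackOff: '
           '"Back-off pulling image \\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\""')
    images = re.findall(r'pulling image \\+"([^"\\]+)\\+"', err)
    print(images)
    # -> ['ghcr.io/flatcar/calico/goldmane:v3.30.4']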
Jan 27 04:49:48.835000 audit[5318]: USER_START pid=5318 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:48.838000 audit[5322]: CRED_ACQ pid=5322 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:48.843317 kernel: audit: type=1105 audit(1769489388.835:807): pid=5318 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:48.843415 kernel: audit: type=1103 audit(1769489388.838:808): pid=5322 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:49.167619 sshd[5322]: Connection closed by 4.153.228.146 port 36720 Jan 27 04:49:49.167000 sshd-session[5318]: pam_unix(sshd:session): session closed for user core Jan 27 04:49:49.168000 audit[5318]: USER_END pid=5318 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:49.168000 audit[5318]: CRED_DISP pid=5318 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:49.178440 kernel: audit: type=1106 audit(1769489389.168:809): pid=5318 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:49.178525 kernel: audit: type=1104 audit(1769489389.168:810): pid=5318 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:49.176436 systemd[1]: sshd@14-10.0.3.32:22-4.153.228.146:36720.service: Deactivated successfully. Jan 27 04:49:49.178603 systemd[1]: session-16.scope: Deactivated successfully. Jan 27 04:49:49.179479 systemd-logind[1653]: Session 16 logged out. Waiting for processes to exit. Jan 27 04:49:49.175000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.3.32:22-4.153.228.146:36720 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:49:49.181688 systemd-logind[1653]: Removed session 16. 
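systemd names each per-connection OpenSSH unit after its sequence number, listener, and peer, e.g. sshd@14-10.0.3.32:22-4.153.228.146:36720.service above. A minimal parsing sketch, assuming IPv4 endpoints (the helper name is illustrative):

    def parse_sshd_unit(unit: str) -> dict:
        # "sshd@<seq>-<local addr:port>-<peer addr:port>.service"
        instance = unit.removeprefix("sshd@").removesuffix(".service")
        seq, local, peer = instance.split("-")  # splitting on "-" is safe for IPv4 only
        return {"seq": int(seq), "local": local, "peer": peer}

    print(parse_sshd_unit("sshd@14-10.0.3.32:22-4.153.228.146:36720.service"))
    # -> {'seq': 14, 'local': '10.0.3.32:22', 'peer': '4.153.228.146:36720'}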
Jan 27 04:49:50.826195 kubelet[2905]: E0127 04:49:50.825737 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:49:52.826567 kubelet[2905]: E0127 04:49:52.826485 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:49:54.283280 systemd[1]: Started sshd@15-10.0.3.32:22-4.153.228.146:36722.service - OpenSSH per-connection server daemon (4.153.228.146:36722). Jan 27 04:49:54.282000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.3.32:22-4.153.228.146:36722 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:49:54.286328 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 04:49:54.286378 kernel: audit: type=1130 audit(1769489394.282:812): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.3.32:22-4.153.228.146:36722 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:49:54.803000 audit[5336]: USER_ACCT pid=5336 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:54.805150 sshd[5336]: Accepted publickey for core from 4.153.228.146 port 36722 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:49:54.808614 kernel: audit: type=1101 audit(1769489394.803:813): pid=5336 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:54.808702 kernel: audit: type=1103 audit(1769489394.807:814): pid=5336 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:54.807000 audit[5336]: CRED_ACQ pid=5336 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:54.809109 sshd-session[5336]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:49:54.812575 kernel: audit: type=1006 audit(1769489394.807:815): pid=5336 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Jan 27 04:49:54.807000 audit[5336]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe7e1f460 a2=3 a3=0 items=0 ppid=1 pid=5336 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:49:54.815722 kernel: audit: type=1300 audit(1769489394.807:815): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe7e1f460 a2=3 a3=0 items=0 ppid=1 pid=5336 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:49:54.807000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:49:54.818057 kernel: audit: type=1327 audit(1769489394.807:815): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:49:54.819421 systemd-logind[1653]: New session 17 of user core. 
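The kernel-echoed lines above report raw audit record types (type=1101, type=1103, type=1006, type=1300, type=1327, ...) for the same events that appear elsewhere in this log under symbolic names. A small lookup table limited to the types seen in this section:

    # numeric audit record types observed in this section and their symbolic names
    AUDIT_TYPES = {
        1006: "LOGIN",
        1101: "USER_ACCT",
        1103: "CRED_ACQ",
        1104: "CRED_DISP",
        1105: "USER_START",
        1106: "USER_END",
        1130: "SERVICE_START",
        1131: "SERVICE_STOP",
        1300: "SYSCALL",
        1327: "PROCTITLE",
    }

    print(AUDIT_TYPES[1300])  # -> SYSCALL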
Jan 27 04:49:54.827229 kubelet[2905]: E0127 04:49:54.827145 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:49:54.831427 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 27 04:49:54.834000 audit[5336]: USER_START pid=5336 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:54.838000 audit[5340]: CRED_ACQ pid=5340 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:54.842175 kernel: audit: type=1105 audit(1769489394.834:816): pid=5336 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:54.842255 kernel: audit: type=1103 audit(1769489394.838:817): pid=5340 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:55.159119 sshd[5340]: Connection closed by 4.153.228.146 port 36722 Jan 27 04:49:55.159580 sshd-session[5336]: pam_unix(sshd:session): session closed for user core Jan 27 04:49:55.160000 audit[5336]: USER_END pid=5336 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:55.164339 systemd[1]: sshd@15-10.0.3.32:22-4.153.228.146:36722.service: Deactivated successfully. Jan 27 04:49:55.160000 audit[5336]: CRED_DISP pid=5336 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:55.166262 systemd[1]: session-17.scope: Deactivated successfully. Jan 27 04:49:55.166995 systemd-logind[1653]: Session 17 logged out. Waiting for processes to exit. 
Jan 27 04:49:55.167898 kernel: audit: type=1106 audit(1769489395.160:818): pid=5336 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:55.167956 kernel: audit: type=1104 audit(1769489395.160:819): pid=5336 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:49:55.163000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.3.32:22-4.153.228.146:36722 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:49:55.168860 systemd-logind[1653]: Removed session 17. Jan 27 04:49:59.827016 kubelet[2905]: E0127 04:49:59.826934 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:50:00.267813 systemd[1]: Started sshd@16-10.0.3.32:22-4.153.228.146:47930.service - OpenSSH per-connection server daemon (4.153.228.146:47930). Jan 27 04:50:00.267000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.3.32:22-4.153.228.146:47930 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:00.271289 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 04:50:00.271374 kernel: audit: type=1130 audit(1769489400.267:821): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.3.32:22-4.153.228.146:47930 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:50:00.793000 audit[5354]: USER_ACCT pid=5354 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:00.795582 sshd[5354]: Accepted publickey for core from 4.153.228.146 port 47930 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:50:00.799071 kernel: audit: type=1101 audit(1769489400.793:822): pid=5354 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:00.799149 kernel: audit: type=1103 audit(1769489400.797:823): pid=5354 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:00.797000 audit[5354]: CRED_ACQ pid=5354 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:00.798936 sshd-session[5354]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:50:00.802550 kernel: audit: type=1006 audit(1769489400.797:824): pid=5354 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Jan 27 04:50:00.797000 audit[5354]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdbc1f7f0 a2=3 a3=0 items=0 ppid=1 pid=5354 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:00.805896 kernel: audit: type=1300 audit(1769489400.797:824): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdbc1f7f0 a2=3 a3=0 items=0 ppid=1 pid=5354 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:00.797000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:00.807174 kernel: audit: type=1327 audit(1769489400.797:824): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:00.811231 systemd-logind[1653]: New session 18 of user core. Jan 27 04:50:00.825892 systemd[1]: Started session-18.scope - Session 18 of User core. 
Jan 27 04:50:00.828000 audit[5354]: USER_START pid=5354 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:00.830000 audit[5358]: CRED_ACQ pid=5358 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:00.834957 kernel: audit: type=1105 audit(1769489400.828:825): pid=5354 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:00.835056 kernel: audit: type=1103 audit(1769489400.830:826): pid=5358 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:01.160769 sshd[5358]: Connection closed by 4.153.228.146 port 47930 Jan 27 04:50:01.161458 sshd-session[5354]: pam_unix(sshd:session): session closed for user core Jan 27 04:50:01.161000 audit[5354]: USER_END pid=5354 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:01.165774 systemd[1]: sshd@16-10.0.3.32:22-4.153.228.146:47930.service: Deactivated successfully. Jan 27 04:50:01.162000 audit[5354]: CRED_DISP pid=5354 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:01.167650 systemd[1]: session-18.scope: Deactivated successfully. Jan 27 04:50:01.168838 kernel: audit: type=1106 audit(1769489401.161:827): pid=5354 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:01.168963 kernel: audit: type=1104 audit(1769489401.162:828): pid=5354 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:01.164000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.3.32:22-4.153.228.146:47930 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:01.170903 systemd-logind[1653]: Session 18 logged out. Waiting for processes to exit. Jan 27 04:50:01.171803 systemd-logind[1653]: Removed session 18. 
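The systemd-logind lines above bracket every SSH session with "New session N of user core" and "Removed session N". A minimal sketch that pairs the two to estimate session lifetimes; it assumes the "Mon DD HH:MM:SS.ffffff" prefix used throughout this log and fixes the year to 2026, so the parsing is illustrative only:

    import re
    from datetime import datetime

    def session_durations(lines):
        opened, durations = {}, {}
        pattern = r"(\w{3} \d{2} [\d:.]+) .*?(New|Removed) session (\d+)"
        for line in lines:
            m = re.match(pattern, line)
            if not m:
                continue
            when = datetime.strptime(m.group(1) + " 2026", "%b %d %H:%M:%S.%f %Y")
            sid = int(m.group(3))
            if m.group(2) == "New":
                opened[sid] = when
            elif sid in opened:
                durations[sid] = (when - opened.pop(sid)).total_seconds()
        return durations

    print(session_durations([
        "Jan 27 04:50:00.811231 systemd-logind[1653]: New session 18 of user core.",
        "Jan 27 04:50:01.171803 systemd-logind[1653]: Removed session 18.",
    ]))
    # -> {18: 0.360572}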
Jan 27 04:50:01.271443 systemd[1]: Started sshd@17-10.0.3.32:22-4.153.228.146:47940.service - OpenSSH per-connection server daemon (4.153.228.146:47940). Jan 27 04:50:01.270000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.3.32:22-4.153.228.146:47940 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:01.814000 audit[5371]: USER_ACCT pid=5371 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:01.815319 sshd[5371]: Accepted publickey for core from 4.153.228.146 port 47940 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:50:01.815000 audit[5371]: CRED_ACQ pid=5371 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:01.815000 audit[5371]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff1a4d250 a2=3 a3=0 items=0 ppid=1 pid=5371 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:01.815000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:01.817201 sshd-session[5371]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:50:01.823152 systemd-logind[1653]: New session 19 of user core. Jan 27 04:50:01.827376 kubelet[2905]: E0127 04:50:01.827323 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:50:01.829317 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 27 04:50:01.832000 audit[5371]: USER_START pid=5371 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:01.834000 audit[5375]: CRED_ACQ pid=5375 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:02.240944 sshd[5375]: Connection closed by 4.153.228.146 port 47940 Jan 27 04:50:02.240181 sshd-session[5371]: pam_unix(sshd:session): session closed for user core Jan 27 04:50:02.240000 audit[5371]: USER_END pid=5371 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:02.241000 audit[5371]: CRED_DISP pid=5371 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:02.245084 systemd-logind[1653]: Session 19 logged out. Waiting for processes to exit. Jan 27 04:50:02.245855 systemd[1]: sshd@17-10.0.3.32:22-4.153.228.146:47940.service: Deactivated successfully. Jan 27 04:50:02.245000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.3.32:22-4.153.228.146:47940 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:02.247967 systemd[1]: session-19.scope: Deactivated successfully. Jan 27 04:50:02.251869 systemd-logind[1653]: Removed session 19. Jan 27 04:50:02.348141 systemd[1]: Started sshd@18-10.0.3.32:22-4.153.228.146:47956.service - OpenSSH per-connection server daemon (4.153.228.146:47956). Jan 27 04:50:02.347000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.3.32:22-4.153.228.146:47956 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:50:02.826275 kubelet[2905]: E0127 04:50:02.826219 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:50:02.826275 kubelet[2905]: E0127 04:50:02.826226 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:50:02.889000 audit[5386]: USER_ACCT pid=5386 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:02.891043 sshd[5386]: Accepted publickey for core from 4.153.228.146 port 47956 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:50:02.890000 audit[5386]: CRED_ACQ pid=5386 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:02.890000 audit[5386]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffeb447950 a2=3 a3=0 items=0 ppid=1 pid=5386 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:02.890000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:02.892813 sshd-session[5386]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:50:02.897392 systemd-logind[1653]: New session 20 of user core. Jan 27 04:50:02.907466 systemd[1]: Started session-20.scope - Session 20 of User core. 
Jan 27 04:50:02.908000 audit[5386]: USER_START pid=5386 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:02.910000 audit[5390]: CRED_ACQ pid=5390 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:03.733000 audit[5401]: NETFILTER_CFG table=filter:144 family=2 entries=14 op=nft_register_rule pid=5401 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:50:03.733000 audit[5401]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffcd2b0820 a2=0 a3=1 items=0 ppid=3073 pid=5401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:03.733000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:50:03.741000 audit[5401]: NETFILTER_CFG table=nat:145 family=2 entries=20 op=nft_register_rule pid=5401 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:50:03.741000 audit[5401]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffcd2b0820 a2=0 a3=1 items=0 ppid=3073 pid=5401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:03.741000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:50:03.758000 audit[5403]: NETFILTER_CFG table=filter:146 family=2 entries=26 op=nft_register_rule pid=5403 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:50:03.758000 audit[5403]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffd033ef30 a2=0 a3=1 items=0 ppid=3073 pid=5403 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:03.758000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:50:03.763000 audit[5403]: NETFILTER_CFG table=nat:147 family=2 entries=20 op=nft_register_rule pid=5403 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:50:03.763000 audit[5403]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffd033ef30 a2=0 a3=1 items=0 ppid=3073 pid=5403 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:03.763000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:50:03.836303 sshd[5390]: Connection closed by 4.153.228.146 port 47956 Jan 27 04:50:03.836627 sshd-session[5386]: pam_unix(sshd:session): session closed for user core 
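The NETFILTER_CFG records above log periodic iptables-restore reloads through xtables-nft-multi; decoded with the earlier PROCTITLE sketch, their proctitle reads "iptables-restore -w 5 -W 100000 --noflush --counters". A minimal sketch that summarizes one such record (family=2 is AF_INET; the parsing is illustrative):

    import re

    record = ("audit[5401]: NETFILTER_CFG table=filter:144 family=2 entries=14 "
              "op=nft_register_rule pid=5401")
    m = re.search(r"table=(\w+):\d+ family=(\d+) entries=(\d+) op=(\w+)", record)
    table, family, entries, op = m.groups()
    print(f"{op}: {entries} entries in the {table} table (family {family})")
    # -> nft_register_rule: 14 entries in the filter table (family 2)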
Jan 27 04:50:03.838000 audit[5386]: USER_END pid=5386 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:03.838000 audit[5386]: CRED_DISP pid=5386 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:03.841902 systemd[1]: sshd@18-10.0.3.32:22-4.153.228.146:47956.service: Deactivated successfully. Jan 27 04:50:03.841000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.3.32:22-4.153.228.146:47956 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:03.844129 systemd[1]: session-20.scope: Deactivated successfully. Jan 27 04:50:03.845787 systemd-logind[1653]: Session 20 logged out. Waiting for processes to exit. Jan 27 04:50:03.847474 systemd-logind[1653]: Removed session 20. Jan 27 04:50:03.950000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.3.32:22-4.153.228.146:47968 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:03.951536 systemd[1]: Started sshd@19-10.0.3.32:22-4.153.228.146:47968.service - OpenSSH per-connection server daemon (4.153.228.146:47968). Jan 27 04:50:04.498000 audit[5410]: USER_ACCT pid=5410 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:04.499981 sshd[5410]: Accepted publickey for core from 4.153.228.146 port 47968 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:50:04.499000 audit[5410]: CRED_ACQ pid=5410 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:04.499000 audit[5410]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdbba4630 a2=3 a3=0 items=0 ppid=1 pid=5410 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:04.499000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:04.501779 sshd-session[5410]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:50:04.507105 systemd-logind[1653]: New session 21 of user core. Jan 27 04:50:04.515359 systemd[1]: Started session-21.scope - Session 21 of User core. 
Jan 27 04:50:04.517000 audit[5410]: USER_START pid=5410 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:04.519000 audit[5414]: CRED_ACQ pid=5414 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:04.957163 sshd[5414]: Connection closed by 4.153.228.146 port 47968 Jan 27 04:50:04.957465 sshd-session[5410]: pam_unix(sshd:session): session closed for user core Jan 27 04:50:04.958000 audit[5410]: USER_END pid=5410 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:04.958000 audit[5410]: CRED_DISP pid=5410 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:04.962539 systemd[1]: sshd@19-10.0.3.32:22-4.153.228.146:47968.service: Deactivated successfully. Jan 27 04:50:04.963000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.3.32:22-4.153.228.146:47968 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:04.965924 systemd[1]: session-21.scope: Deactivated successfully. Jan 27 04:50:04.968341 systemd-logind[1653]: Session 21 logged out. Waiting for processes to exit. Jan 27 04:50:04.969448 systemd-logind[1653]: Removed session 21. Jan 27 04:50:05.063417 systemd[1]: Started sshd@20-10.0.3.32:22-4.153.228.146:37494.service - OpenSSH per-connection server daemon (4.153.228.146:37494). Jan 27 04:50:05.062000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.3.32:22-4.153.228.146:37494 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:50:05.576000 audit[5426]: USER_ACCT pid=5426 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:05.578237 sshd[5426]: Accepted publickey for core from 4.153.228.146 port 37494 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:50:05.578601 kernel: kauditd_printk_skb: 47 callbacks suppressed Jan 27 04:50:05.578630 kernel: audit: type=1101 audit(1769489405.576:862): pid=5426 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:05.581724 sshd-session[5426]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:50:05.579000 audit[5426]: CRED_ACQ pid=5426 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:05.585002 kernel: audit: type=1103 audit(1769489405.579:863): pid=5426 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:05.586978 kernel: audit: type=1006 audit(1769489405.579:864): pid=5426 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Jan 27 04:50:05.579000 audit[5426]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd289b160 a2=3 a3=0 items=0 ppid=1 pid=5426 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:05.590942 kernel: audit: type=1300 audit(1769489405.579:864): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd289b160 a2=3 a3=0 items=0 ppid=1 pid=5426 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:05.591050 kernel: audit: type=1327 audit(1769489405.579:864): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:05.579000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:05.596484 systemd-logind[1653]: New session 22 of user core. Jan 27 04:50:05.604301 systemd[1]: Started session-22.scope - Session 22 of User core. 
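The recurring "kauditd_printk_skb: N callbacks suppressed" notices above indicate that N audit records were rate-limited out of the kernel-log echo; the underlying audit records themselves still appear in this journal. A minimal tally sketch over such lines (illustrative):

    import re

    def suppressed_total(lines):
        # sum the per-notice suppression counts
        return sum(int(m.group(1))
                   for line in lines
                   for m in re.finditer(r"kauditd_printk_skb: (\d+) callbacks suppressed", line))

    print(suppressed_total([
        "Jan 27 04:50:05.578601 kernel: kauditd_printk_skb: 47 callbacks suppressed",
        "Jan 27 04:50:11.065155 kernel: kauditd_printk_skb: 6 callbacks suppressed",
    ]))
    # -> 53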
Jan 27 04:50:05.606000 audit[5426]: USER_START pid=5426 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:05.612627 kernel: audit: type=1105 audit(1769489405.606:865): pid=5426 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:05.612712 kernel: audit: type=1103 audit(1769489405.610:866): pid=5430 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:05.610000 audit[5430]: CRED_ACQ pid=5430 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:05.943114 sshd[5430]: Connection closed by 4.153.228.146 port 37494 Jan 27 04:50:05.943809 sshd-session[5426]: pam_unix(sshd:session): session closed for user core Jan 27 04:50:05.943000 audit[5426]: USER_END pid=5426 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:05.944000 audit[5426]: CRED_DISP pid=5426 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:05.948020 systemd-logind[1653]: Session 22 logged out. Waiting for processes to exit. Jan 27 04:50:05.948216 systemd[1]: sshd@20-10.0.3.32:22-4.153.228.146:37494.service: Deactivated successfully. Jan 27 04:50:05.950022 systemd[1]: session-22.scope: Deactivated successfully. Jan 27 04:50:05.951984 kernel: audit: type=1106 audit(1769489405.943:867): pid=5426 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:05.952051 kernel: audit: type=1104 audit(1769489405.944:868): pid=5426 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:05.952072 kernel: audit: type=1131 audit(1769489405.947:869): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.3.32:22-4.153.228.146:37494 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:50:05.947000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.3.32:22-4.153.228.146:37494 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:05.953007 systemd-logind[1653]: Removed session 22. Jan 27 04:50:06.826508 kubelet[2905]: E0127 04:50:06.826170 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:50:07.564000 audit[5443]: NETFILTER_CFG table=filter:148 family=2 entries=26 op=nft_register_rule pid=5443 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:50:07.564000 audit[5443]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc55a5f30 a2=0 a3=1 items=0 ppid=3073 pid=5443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:07.564000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:50:07.570000 audit[5443]: NETFILTER_CFG table=nat:149 family=2 entries=104 op=nft_register_chain pid=5443 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 27 04:50:07.570000 audit[5443]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=48684 a0=3 a1=ffffc55a5f30 a2=0 a3=1 items=0 ppid=3073 pid=5443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:07.570000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 27 04:50:07.828669 kubelet[2905]: E0127 04:50:07.828531 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:50:11.065155 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 27 04:50:11.065264 kernel: audit: type=1130 audit(1769489411.061:872): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.3.32:22-4.153.228.146:37496 
comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:11.061000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.3.32:22-4.153.228.146:37496 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:11.062132 systemd[1]: Started sshd@21-10.0.3.32:22-4.153.228.146:37496.service - OpenSSH per-connection server daemon (4.153.228.146:37496). Jan 27 04:50:11.607000 audit[5445]: USER_ACCT pid=5445 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:11.611937 sshd[5445]: Accepted publickey for core from 4.153.228.146 port 37496 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:50:11.610000 audit[5445]: CRED_ACQ pid=5445 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:11.612497 sshd-session[5445]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:50:11.615224 kernel: audit: type=1101 audit(1769489411.607:873): pid=5445 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:11.615326 kernel: audit: type=1103 audit(1769489411.610:874): pid=5445 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:11.617152 kernel: audit: type=1006 audit(1769489411.610:875): pid=5445 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Jan 27 04:50:11.610000 audit[5445]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe84fc030 a2=3 a3=0 items=0 ppid=1 pid=5445 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:11.620641 kernel: audit: type=1300 audit(1769489411.610:875): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe84fc030 a2=3 a3=0 items=0 ppid=1 pid=5445 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:11.610000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:11.622122 kernel: audit: type=1327 audit(1769489411.610:875): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:11.624156 systemd-logind[1653]: New session 23 of user core. Jan 27 04:50:11.635381 systemd[1]: Started session-23.scope - Session 23 of User core. 
Jan 27 04:50:11.636000 audit[5445]: USER_START pid=5445 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:11.638000 audit[5451]: CRED_ACQ pid=5451 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:11.643822 kernel: audit: type=1105 audit(1769489411.636:876): pid=5445 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:11.644190 kernel: audit: type=1103 audit(1769489411.638:877): pid=5451 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:11.975467 sshd[5451]: Connection closed by 4.153.228.146 port 37496 Jan 27 04:50:11.976047 sshd-session[5445]: pam_unix(sshd:session): session closed for user core Jan 27 04:50:11.976000 audit[5445]: USER_END pid=5445 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:11.980587 systemd[1]: sshd@21-10.0.3.32:22-4.153.228.146:37496.service: Deactivated successfully. Jan 27 04:50:11.976000 audit[5445]: CRED_DISP pid=5445 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:11.982786 systemd[1]: session-23.scope: Deactivated successfully. Jan 27 04:50:11.983385 kernel: audit: type=1106 audit(1769489411.976:878): pid=5445 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:11.983440 kernel: audit: type=1104 audit(1769489411.976:879): pid=5445 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:11.979000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.3.32:22-4.153.228.146:37496 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:11.983864 systemd-logind[1653]: Session 23 logged out. Waiting for processes to exit. Jan 27 04:50:11.985200 systemd-logind[1653]: Removed session 23. 
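The PROCTITLE fields in the audit records above (type=1327) are hex-encoded, NUL-separated argument vectors; `ausearch -i` would normally render them in readable form, but they can also be decoded directly. A minimal Python sketch, using the iptables-restore and sshd-session proctitles recorded above as input:

# Decode an audit PROCTITLE value: hex string -> NUL-separated argv.
def decode_proctitle(hex_title: str) -> list[str]:
    return [a.decode() for a in bytes.fromhex(hex_title).split(b"\x00")]

# proctitle from the NETFILTER_CFG/SYSCALL records for pid 5443 above
print(decode_proctitle(
    "69707461626C65732D726573746F7265002D770035002D5700313030303030"
    "002D2D6E6F666C757368002D2D636F756E74657273"
))
# -> ['iptables-restore', '-w', '5', '-W', '100000', '--noflush', '--counters']

# proctitle from the sshd-session records for pid 5445 above
print(decode_proctitle("737368642D73657373696F6E3A20636F7265205B707269765D"))
# -> ['sshd-session: core [priv]']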
Jan 27 04:50:13.826689 kubelet[2905]: E0127 04:50:13.826601 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:50:14.827453 kubelet[2905]: E0127 04:50:14.827395 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:50:15.826586 kubelet[2905]: E0127 04:50:15.826016 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:50:16.825728 kubelet[2905]: E0127 04:50:16.825280 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:50:17.087076 systemd[1]: Started sshd@22-10.0.3.32:22-4.153.228.146:49296.service - OpenSSH per-connection server daemon (4.153.228.146:49296). Jan 27 04:50:17.086000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.3.32:22-4.153.228.146:49296 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:17.090346 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 04:50:17.090465 kernel: audit: type=1130 audit(1769489417.086:881): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.3.32:22-4.153.228.146:49296 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:50:17.606000 audit[5489]: USER_ACCT pid=5489 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:17.608169 sshd[5489]: Accepted publickey for core from 4.153.228.146 port 49296 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:50:17.611124 kernel: audit: type=1101 audit(1769489417.606:882): pid=5489 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:17.610000 audit[5489]: CRED_ACQ pid=5489 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:17.612819 sshd-session[5489]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:50:17.616699 kernel: audit: type=1103 audit(1769489417.610:883): pid=5489 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:17.616772 kernel: audit: type=1006 audit(1769489417.611:884): pid=5489 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Jan 27 04:50:17.611000 audit[5489]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd51cd230 a2=3 a3=0 items=0 ppid=1 pid=5489 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:17.620425 kernel: audit: type=1300 audit(1769489417.611:884): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd51cd230 a2=3 a3=0 items=0 ppid=1 pid=5489 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:17.611000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:17.621644 kernel: audit: type=1327 audit(1769489417.611:884): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:17.624619 systemd-logind[1653]: New session 24 of user core. Jan 27 04:50:17.636789 systemd[1]: Started session-24.scope - Session 24 of User core. 
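The SYSCALL records above carry only raw numbers (arch=c00000b7, syscall=64 for the sshd-session entries, syscall=211 for the iptables-restore one). A small sketch that labels just those two values, on the assumption that they follow the arm64 asm-generic syscall table, with 0xC00000B7 being the aarch64 audit architecture constant:

# Label the raw SYSCALL fields seen in the audit records above. Only the two
# syscall numbers that actually appear in this log are mapped here.
AUDIT_ARCH_AARCH64 = 0xC00000B7           # EM_AARCH64 | 64BIT | LE
ARM64_SYSCALLS = {64: "write", 211: "sendmsg"}

def describe(arch: int, nr: int) -> str:
    arch_name = "aarch64" if arch == AUDIT_ARCH_AARCH64 else hex(arch)
    return f"{arch_name}/{ARM64_SYSCALLS.get(nr, nr)}"

print(describe(0xC00000B7, 64))    # sshd-session records   -> aarch64/write
print(describe(0xC00000B7, 211))   # iptables-restore record -> aarch64/sendmsg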
Jan 27 04:50:17.638000 audit[5489]: USER_START pid=5489 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:17.641000 audit[5493]: CRED_ACQ pid=5493 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:17.645588 kernel: audit: type=1105 audit(1769489417.638:885): pid=5489 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:17.645664 kernel: audit: type=1103 audit(1769489417.641:886): pid=5493 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:17.967082 sshd[5493]: Connection closed by 4.153.228.146 port 49296 Jan 27 04:50:17.967372 sshd-session[5489]: pam_unix(sshd:session): session closed for user core Jan 27 04:50:17.968000 audit[5489]: USER_END pid=5489 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:17.973016 systemd[1]: sshd@22-10.0.3.32:22-4.153.228.146:49296.service: Deactivated successfully. Jan 27 04:50:17.968000 audit[5489]: CRED_DISP pid=5489 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:17.976141 kernel: audit: type=1106 audit(1769489417.968:887): pid=5489 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:17.976217 kernel: audit: type=1104 audit(1769489417.968:888): pid=5489 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:17.974000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.3.32:22-4.153.228.146:49296 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:17.977614 systemd[1]: session-24.scope: Deactivated successfully. Jan 27 04:50:17.981823 systemd-logind[1653]: Session 24 logged out. Waiting for processes to exit. Jan 27 04:50:17.982999 systemd-logind[1653]: Removed session 24. 
Jan 27 04:50:18.826216 kubelet[2905]: E0127 04:50:18.826164 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:50:20.826540 kubelet[2905]: E0127 04:50:20.826494 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:50:23.078586 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 04:50:23.078702 kernel: audit: type=1130 audit(1769489423.075:890): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.3.32:22-4.153.228.146:49298 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:23.075000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.3.32:22-4.153.228.146:49298 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:23.075382 systemd[1]: Started sshd@23-10.0.3.32:22-4.153.228.146:49298.service - OpenSSH per-connection server daemon (4.153.228.146:49298). 
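Every ImagePullBackOff entry above traces back to the same condition that containerd reports further down: ghcr.io answers 404 Not Found for the flatcar/calico/* repositories at tag v3.30.4. A hedged sketch of how that lookup can be reproduced outside the kubelet, assuming ghcr.io exposes the standard Docker Registry v2 anonymous token and manifest endpoints; the repository and tag names are taken from the log, everything else is illustrative:

# Reproduce the manifest lookup that containerd performs for one of the
# failing images. A 404 here matches the "not found" pull errors above.
import json
import urllib.error
import urllib.request

REGISTRY = "ghcr.io"
REPO = "flatcar/calico/whisker"   # repository from the failing pulls above
TAG = "v3.30.4"                   # tag the kubelet keeps retrying

# Anonymous pull token for a public repository (assumed /token endpoint).
token_url = f"https://{REGISTRY}/token?scope=repository:{REPO}:pull"
with urllib.request.urlopen(token_url) as resp:
    token = json.load(resp)["token"]

# Ask for the manifest; a missing tag should come back as HTTP 404.
req = urllib.request.Request(
    f"https://{REGISTRY}/v2/{REPO}/manifests/{TAG}",
    headers={
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.oci.image.index.v1+json",
    },
)
try:
    with urllib.request.urlopen(req) as resp:
        print("manifest found:", resp.status)
except urllib.error.HTTPError as err:
    print("manifest lookup failed:", err.code)   # expected: 404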
Jan 27 04:50:23.596000 audit[5506]: USER_ACCT pid=5506 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:23.600303 kernel: audit: type=1101 audit(1769489423.596:891): pid=5506 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:23.600345 sshd[5506]: Accepted publickey for core from 4.153.228.146 port 49298 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:50:23.600000 audit[5506]: CRED_ACQ pid=5506 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:23.601896 sshd-session[5506]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:50:23.605607 kernel: audit: type=1103 audit(1769489423.600:892): pid=5506 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:23.605740 kernel: audit: type=1006 audit(1769489423.601:893): pid=5506 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Jan 27 04:50:23.601000 audit[5506]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffef519b60 a2=3 a3=0 items=0 ppid=1 pid=5506 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:23.608163 systemd-logind[1653]: New session 25 of user core. Jan 27 04:50:23.608899 kernel: audit: type=1300 audit(1769489423.601:893): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffef519b60 a2=3 a3=0 items=0 ppid=1 pid=5506 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:23.601000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:23.610044 kernel: audit: type=1327 audit(1769489423.601:893): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:23.618347 systemd[1]: Started session-25.scope - Session 25 of User core. 
Jan 27 04:50:23.620000 audit[5506]: USER_START pid=5506 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:23.624000 audit[5516]: CRED_ACQ pid=5516 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:23.627266 kernel: audit: type=1105 audit(1769489423.620:894): pid=5506 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:23.627353 kernel: audit: type=1103 audit(1769489423.624:895): pid=5516 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:23.953195 sshd[5516]: Connection closed by 4.153.228.146 port 49298 Jan 27 04:50:23.953915 sshd-session[5506]: pam_unix(sshd:session): session closed for user core Jan 27 04:50:23.955000 audit[5506]: USER_END pid=5506 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:23.958920 systemd-logind[1653]: Session 25 logged out. Waiting for processes to exit. Jan 27 04:50:23.959057 systemd[1]: sshd@23-10.0.3.32:22-4.153.228.146:49298.service: Deactivated successfully. Jan 27 04:50:23.956000 audit[5506]: CRED_DISP pid=5506 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:23.961216 systemd[1]: session-25.scope: Deactivated successfully. Jan 27 04:50:23.962005 kernel: audit: type=1106 audit(1769489423.955:896): pid=5506 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:23.962077 kernel: audit: type=1104 audit(1769489423.956:897): pid=5506 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:23.959000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.3.32:22-4.153.228.146:49298 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:23.963828 systemd-logind[1653]: Removed session 25. 
Jan 27 04:50:26.826371 kubelet[2905]: E0127 04:50:26.826317 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:50:26.827318 kubelet[2905]: E0127 04:50:26.827271 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:50:28.825689 kubelet[2905]: E0127 04:50:28.825608 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:50:29.071418 systemd[1]: Started sshd@24-10.0.3.32:22-4.153.228.146:49610.service - OpenSSH per-connection server daemon (4.153.228.146:49610). Jan 27 04:50:29.071000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.3.32:22-4.153.228.146:49610 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:29.072552 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 04:50:29.072592 kernel: audit: type=1130 audit(1769489429.071:899): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.3.32:22-4.153.228.146:49610 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:50:29.609000 audit[5531]: USER_ACCT pid=5531 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:29.609512 sshd[5531]: Accepted publickey for core from 4.153.228.146 port 49610 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:50:29.613802 sshd-session[5531]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:50:29.612000 audit[5531]: CRED_ACQ pid=5531 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:29.616594 kernel: audit: type=1101 audit(1769489429.609:900): pid=5531 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:29.616711 kernel: audit: type=1103 audit(1769489429.612:901): pid=5531 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:29.619298 kernel: audit: type=1006 audit(1769489429.613:902): pid=5531 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Jan 27 04:50:29.619381 kernel: audit: type=1300 audit(1769489429.613:902): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd81be810 a2=3 a3=0 items=0 ppid=1 pid=5531 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:29.613000 audit[5531]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd81be810 a2=3 a3=0 items=0 ppid=1 pid=5531 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:29.613000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:29.623785 kernel: audit: type=1327 audit(1769489429.613:902): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:29.626140 systemd-logind[1653]: New session 26 of user core. Jan 27 04:50:29.636519 systemd[1]: Started session-26.scope - Session 26 of User core. 
Jan 27 04:50:29.639000 audit[5531]: USER_START pid=5531 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:29.640000 audit[5535]: CRED_ACQ pid=5535 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:29.646010 kernel: audit: type=1105 audit(1769489429.639:903): pid=5531 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:29.646078 kernel: audit: type=1103 audit(1769489429.640:904): pid=5535 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:29.998521 sshd[5535]: Connection closed by 4.153.228.146 port 49610 Jan 27 04:50:29.997573 sshd-session[5531]: pam_unix(sshd:session): session closed for user core Jan 27 04:50:29.999000 audit[5531]: USER_END pid=5531 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:30.000000 audit[5531]: CRED_DISP pid=5531 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:30.003989 systemd[1]: sshd@24-10.0.3.32:22-4.153.228.146:49610.service: Deactivated successfully. Jan 27 04:50:30.005799 systemd[1]: session-26.scope: Deactivated successfully. Jan 27 04:50:30.005900 kernel: audit: type=1106 audit(1769489429.999:905): pid=5531 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:30.005933 kernel: audit: type=1104 audit(1769489430.000:906): pid=5531 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:30.004000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.3.32:22-4.153.228.146:49610 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:30.006771 systemd-logind[1653]: Session 26 logged out. Waiting for processes to exit. Jan 27 04:50:30.007790 systemd-logind[1653]: Removed session 26. 
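Sessions 23 through 26 above all follow the same pattern: the audit trail opens with USER_START and closes a few hundred milliseconds later with USER_END for the same ses= value. A small sketch that pairs those records from journal text on stdin and prints how long each session stayed open; the timestamp format and field names are copied from the records above, and the year is assumed because the journal prefix omits it:

# Pair USER_START / USER_END audit records by audit session id and report
# how long each SSH session stayed open.
import re
import sys
from datetime import datetime

LINE = re.compile(
    r"(?P<ts>\w{3} \d{2} \d{2}:\d{2}:\d{2}\.\d+).*?"
    r"audit\[\d+\]: (?P<type>USER_START|USER_END) .*?\bses=(?P<ses>\d+)"
)

starts = {}
for line in sys.stdin:
    m = LINE.search(line)
    if not m:
        continue
    # The journal omits the year; 2026 is assumed from the containerd timestamps.
    ts = datetime.strptime(f"2026 {m['ts']}", "%Y %b %d %H:%M:%S.%f")
    if m["type"] == "USER_START":
        starts[m["ses"]] = ts
    elif m["ses"] in starts:
        duration = (ts - starts.pop(m["ses"])).total_seconds()
        print(f"ses={m['ses']} open for {duration:.1f}s")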
Jan 27 04:50:30.825935 kubelet[2905]: E0127 04:50:30.825878 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:50:31.826542 kubelet[2905]: E0127 04:50:31.826313 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:50:31.827579 kubelet[2905]: E0127 04:50:31.827190 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:50:35.103948 systemd[1]: Started sshd@25-10.0.3.32:22-4.153.228.146:45084.service - OpenSSH per-connection server daemon (4.153.228.146:45084). Jan 27 04:50:35.103000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.3.32:22-4.153.228.146:45084 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:35.107448 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 04:50:35.107501 kernel: audit: type=1130 audit(1769489435.103:908): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.3.32:22-4.153.228.146:45084 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 27 04:50:35.628000 audit[5550]: USER_ACCT pid=5550 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:35.629276 sshd[5550]: Accepted publickey for core from 4.153.228.146 port 45084 ssh2: RSA SHA256:hgDJPeUXepxfsBBA8l5U8ytxOXEo3vcE3SUEcMFaKbI Jan 27 04:50:35.632129 kernel: audit: type=1101 audit(1769489435.628:909): pid=5550 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:35.633178 kernel: audit: type=1103 audit(1769489435.632:910): pid=5550 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:35.632000 audit[5550]: CRED_ACQ pid=5550 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:35.632902 sshd-session[5550]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 27 04:50:35.637033 kernel: audit: type=1006 audit(1769489435.632:911): pid=5550 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1 Jan 27 04:50:35.637136 kernel: audit: type=1300 audit(1769489435.632:911): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcd4b17e0 a2=3 a3=0 items=0 ppid=1 pid=5550 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:35.632000 audit[5550]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcd4b17e0 a2=3 a3=0 items=0 ppid=1 pid=5550 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:50:35.638574 systemd-logind[1653]: New session 27 of user core. Jan 27 04:50:35.632000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:35.641030 kernel: audit: type=1327 audit(1769489435.632:911): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 27 04:50:35.641295 systemd[1]: Started session-27.scope - Session 27 of User core. 
Jan 27 04:50:35.644000 audit[5550]: USER_START pid=5550 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:35.645000 audit[5554]: CRED_ACQ pid=5554 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:35.649871 kernel: audit: type=1105 audit(1769489435.644:912): pid=5550 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:35.649954 kernel: audit: type=1103 audit(1769489435.645:913): pid=5554 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:36.019499 sshd[5554]: Connection closed by 4.153.228.146 port 45084 Jan 27 04:50:36.020229 sshd-session[5550]: pam_unix(sshd:session): session closed for user core Jan 27 04:50:36.022000 audit[5550]: USER_END pid=5550 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:36.025359 systemd[1]: sshd@25-10.0.3.32:22-4.153.228.146:45084.service: Deactivated successfully. Jan 27 04:50:36.022000 audit[5550]: CRED_DISP pid=5550 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:36.027289 systemd[1]: session-27.scope: Deactivated successfully. Jan 27 04:50:36.028080 systemd-logind[1653]: Session 27 logged out. Waiting for processes to exit. Jan 27 04:50:36.028873 kernel: audit: type=1106 audit(1769489436.022:914): pid=5550 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:36.028938 kernel: audit: type=1104 audit(1769489436.022:915): pid=5550 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 27 04:50:36.024000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.3.32:22-4.153.228.146:45084 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 27 04:50:36.029678 systemd-logind[1653]: Removed session 27. 
Jan 27 04:50:39.828861 kubelet[2905]: E0127 04:50:39.828817 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:50:41.825599 kubelet[2905]: E0127 04:50:41.825543 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:50:41.826060 containerd[1666]: time="2026-01-27T04:50:41.825682003Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 27 04:50:42.177165 containerd[1666]: time="2026-01-27T04:50:42.177018635Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:50:42.181346 containerd[1666]: time="2026-01-27T04:50:42.181085776Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 27 04:50:42.181346 containerd[1666]: time="2026-01-27T04:50:42.181142217Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 27 04:50:42.181479 kubelet[2905]: E0127 04:50:42.181441 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 04:50:42.181517 kubelet[2905]: E0127 04:50:42.181494 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 27 04:50:42.181650 kubelet[2905]: E0127 04:50:42.181606 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1c7c961a1fbc4629aa41d023389e3c7f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xvk54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68746999d7-v2tpn_calico-system(a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 27 04:50:42.183592 containerd[1666]: time="2026-01-27T04:50:42.183550149Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 27 04:50:42.521124 containerd[1666]: time="2026-01-27T04:50:42.521028711Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:50:42.522967 containerd[1666]: time="2026-01-27T04:50:42.522917920Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 27 04:50:42.523081 containerd[1666]: time="2026-01-27T04:50:42.522962360Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 27 04:50:42.524287 kubelet[2905]: E0127 04:50:42.524227 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 04:50:42.524348 kubelet[2905]: E0127 04:50:42.524289 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 27 04:50:42.524460 kubelet[2905]: E0127 04:50:42.524401 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xvk54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-68746999d7-v2tpn_calico-system(a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 27 04:50:42.525716 kubelet[2905]: E0127 04:50:42.525656 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:50:44.826591 containerd[1666]: time="2026-01-27T04:50:44.826340752Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 27 04:50:45.162683 containerd[1666]: time="2026-01-27T04:50:45.162523507Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:50:45.164125 containerd[1666]: time="2026-01-27T04:50:45.164073835Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 27 04:50:45.164255 
containerd[1666]: time="2026-01-27T04:50:45.164156115Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 27 04:50:45.164352 kubelet[2905]: E0127 04:50:45.164289 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 04:50:45.164352 kubelet[2905]: E0127 04:50:45.164337 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 27 04:50:45.164652 kubelet[2905]: E0127 04:50:45.164450 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsdqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-vk94b_calico-system(56e3e6d0-7a6b-4ba3-9081-3231ea811709): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 27 04:50:45.166397 containerd[1666]: time="2026-01-27T04:50:45.166375767Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 27 04:50:45.497354 containerd[1666]: time="2026-01-27T04:50:45.497282055Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:50:45.499054 containerd[1666]: time="2026-01-27T04:50:45.499004184Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 27 04:50:45.499163 containerd[1666]: time="2026-01-27T04:50:45.499041024Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 27 04:50:45.499299 kubelet[2905]: E0127 04:50:45.499247 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 04:50:45.499299 kubelet[2905]: E0127 04:50:45.499295 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 27 04:50:45.499458 kubelet[2905]: E0127 04:50:45.499419 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsdqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-vk94b_calico-system(56e3e6d0-7a6b-4ba3-9081-3231ea811709): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 27 04:50:45.500621 kubelet[2905]: E0127 04:50:45.500566 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:50:45.830913 containerd[1666]: time="2026-01-27T04:50:45.830735316Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 27 04:50:46.185238 containerd[1666]: time="2026-01-27T04:50:46.185068844Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:50:46.186595 containerd[1666]: time="2026-01-27T04:50:46.186539571Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 27 04:50:46.186690 containerd[1666]: time="2026-01-27T04:50:46.186594612Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 27 04:50:46.186843 kubelet[2905]: E0127 04:50:46.186787 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 04:50:46.187138 kubelet[2905]: E0127 04:50:46.186841 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 27 04:50:46.187138 kubelet[2905]: E0127 04:50:46.187061 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nmpx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-799789d486-wpdhm_calico-system(5edc651f-3273-46b9-a554-3e38c11ea910): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 27 04:50:46.187524 containerd[1666]: time="2026-01-27T04:50:46.187486016Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 27 04:50:46.188588 kubelet[2905]: E0127 04:50:46.188515 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:50:46.534339 containerd[1666]: 
time="2026-01-27T04:50:46.534223625Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:50:46.535871 containerd[1666]: time="2026-01-27T04:50:46.535807073Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 04:50:46.535915 containerd[1666]: time="2026-01-27T04:50:46.535854314Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 04:50:46.536075 kubelet[2905]: E0127 04:50:46.536024 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:50:46.536075 kubelet[2905]: E0127 04:50:46.536071 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:50:46.537260 kubelet[2905]: E0127 04:50:46.536540 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tvs85,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
calico-apiserver-b5b8d765d-qg4kw_calico-apiserver(e725b346-d7db-48ca-8580-5074b068cd87): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 04:50:46.537774 kubelet[2905]: E0127 04:50:46.537735 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:50:52.826521 containerd[1666]: time="2026-01-27T04:50:52.826243166Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 27 04:50:53.165794 containerd[1666]: time="2026-01-27T04:50:53.165657098Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:50:53.169237 containerd[1666]: time="2026-01-27T04:50:53.169162196Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 27 04:50:53.169335 containerd[1666]: time="2026-01-27T04:50:53.169163996Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 27 04:50:53.169468 kubelet[2905]: E0127 04:50:53.169428 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 04:50:53.169740 kubelet[2905]: E0127 04:50:53.169480 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 27 04:50:53.169740 kubelet[2905]: E0127 04:50:53.169608 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zbn74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-trbp2_calico-system(1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 27 04:50:53.171215 kubelet[2905]: E0127 04:50:53.171000 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:50:55.826892 containerd[1666]: time="2026-01-27T04:50:55.826789314Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 
27 04:50:56.166330 containerd[1666]: time="2026-01-27T04:50:56.166063725Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 27 04:50:56.167763 containerd[1666]: time="2026-01-27T04:50:56.167642693Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 27 04:50:56.167959 containerd[1666]: time="2026-01-27T04:50:56.167735854Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 27 04:50:56.168035 kubelet[2905]: E0127 04:50:56.167978 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:50:56.168334 kubelet[2905]: E0127 04:50:56.168042 2905 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 27 04:50:56.168334 kubelet[2905]: E0127 04:50:56.168222 2905 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wb7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod calico-apiserver-b5b8d765d-58pbp_calico-apiserver(b46e8e69-6e20-4188-9b8d-4e06490f6e72): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 27 04:50:56.169538 kubelet[2905]: E0127 04:50:56.169461 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:50:56.826441 kubelet[2905]: E0127 04:50:56.826387 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:50:57.827074 kubelet[2905]: E0127 04:50:57.827025 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87" Jan 27 04:50:57.829302 kubelet[2905]: E0127 04:50:57.829264 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:51:00.826358 kubelet[2905]: E0127 04:51:00.826313 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910" Jan 27 04:51:03.829035 kubelet[2905]: E0127 04:51:03.828945 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b" Jan 27 04:51:04.591627 systemd[1]: cri-containerd-e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a.scope: Deactivated successfully. Jan 27 04:51:04.592399 systemd[1]: cri-containerd-e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a.scope: Consumed 39.799s CPU time, 119.1M memory peak. Jan 27 04:51:04.593381 containerd[1666]: time="2026-01-27T04:51:04.593067759Z" level=info msg="received container exit event container_id:\"e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a\" id:\"e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a\" pid:3239 exit_status:1 exited_at:{seconds:1769489464 nanos:592743957}" Jan 27 04:51:04.595000 audit: BPF prog-id=149 op=UNLOAD Jan 27 04:51:04.597773 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 27 04:51:04.597838 kernel: audit: type=1334 audit(1769489464.595:917): prog-id=149 op=UNLOAD Jan 27 04:51:04.595000 audit: BPF prog-id=153 op=UNLOAD Jan 27 04:51:04.598886 kernel: audit: type=1334 audit(1769489464.595:918): prog-id=153 op=UNLOAD Jan 27 04:51:04.613591 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a-rootfs.mount: Deactivated successfully. Jan 27 04:51:05.046028 kubelet[2905]: E0127 04:51:05.045987 2905 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.3.32:54554->10.0.3.116:2379: read: connection timed out" Jan 27 04:51:05.192000 audit: BPF prog-id=259 op=LOAD Jan 27 04:51:05.192568 systemd[1]: cri-containerd-faeceb7f650cae6a2fa2c1ee6b11027e47f7d4bc28288c3c796884cf402156f8.scope: Deactivated successfully. Jan 27 04:51:05.193011 systemd[1]: cri-containerd-faeceb7f650cae6a2fa2c1ee6b11027e47f7d4bc28288c3c796884cf402156f8.scope: Consumed 4.303s CPU time, 60.9M memory peak. 
Jan 27 04:51:05.192000 audit: BPF prog-id=96 op=UNLOAD Jan 27 04:51:05.195216 kernel: audit: type=1334 audit(1769489465.192:919): prog-id=259 op=LOAD Jan 27 04:51:05.195275 kernel: audit: type=1334 audit(1769489465.192:920): prog-id=96 op=UNLOAD Jan 27 04:51:05.195942 containerd[1666]: time="2026-01-27T04:51:05.195833914Z" level=info msg="received container exit event container_id:\"faeceb7f650cae6a2fa2c1ee6b11027e47f7d4bc28288c3c796884cf402156f8\" id:\"faeceb7f650cae6a2fa2c1ee6b11027e47f7d4bc28288c3c796884cf402156f8\" pid:2767 exit_status:1 exited_at:{seconds:1769489465 nanos:195544872}" Jan 27 04:51:05.200000 audit: BPF prog-id=111 op=UNLOAD Jan 27 04:51:05.200000 audit: BPF prog-id=115 op=UNLOAD Jan 27 04:51:05.202729 kernel: audit: type=1334 audit(1769489465.200:921): prog-id=111 op=UNLOAD Jan 27 04:51:05.202791 kernel: audit: type=1334 audit(1769489465.200:922): prog-id=115 op=UNLOAD Jan 27 04:51:05.216662 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-faeceb7f650cae6a2fa2c1ee6b11027e47f7d4bc28288c3c796884cf402156f8-rootfs.mount: Deactivated successfully. Jan 27 04:51:05.417061 kubelet[2905]: I0127 04:51:05.416945 2905 scope.go:117] "RemoveContainer" containerID="e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a" Jan 27 04:51:05.419356 kubelet[2905]: I0127 04:51:05.418679 2905 scope.go:117] "RemoveContainer" containerID="faeceb7f650cae6a2fa2c1ee6b11027e47f7d4bc28288c3c796884cf402156f8" Jan 27 04:51:05.419450 containerd[1666]: time="2026-01-27T04:51:05.418795851Z" level=info msg="CreateContainer within sandbox \"462971cd0ca9b64c54265c1d210e0f530dbcad764ab4382b080bde39bff7a456\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 27 04:51:05.421251 containerd[1666]: time="2026-01-27T04:51:05.421219784Z" level=info msg="CreateContainer within sandbox \"1acf1486575e86539efe0574e057fe5b875e35a867014752d29000ebecb77924\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 27 04:51:05.438191 containerd[1666]: time="2026-01-27T04:51:05.438141430Z" level=info msg="Container eae83d795e4ff01dfcabfd41984aec07c9dc91e0893c789f7cc9b3411becfa02: CDI devices from CRI Config.CDIDevices: []" Jan 27 04:51:05.444569 containerd[1666]: time="2026-01-27T04:51:05.444522703Z" level=info msg="Container 93f98c508e5386ed04b25aebf40917960adf40cf3d82b7186f50f06da64472e6: CDI devices from CRI Config.CDIDevices: []" Jan 27 04:51:05.449790 containerd[1666]: time="2026-01-27T04:51:05.449733449Z" level=info msg="CreateContainer within sandbox \"462971cd0ca9b64c54265c1d210e0f530dbcad764ab4382b080bde39bff7a456\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"eae83d795e4ff01dfcabfd41984aec07c9dc91e0893c789f7cc9b3411becfa02\"" Jan 27 04:51:05.450270 containerd[1666]: time="2026-01-27T04:51:05.450194491Z" level=info msg="StartContainer for \"eae83d795e4ff01dfcabfd41984aec07c9dc91e0893c789f7cc9b3411becfa02\"" Jan 27 04:51:05.451116 containerd[1666]: time="2026-01-27T04:51:05.451053416Z" level=info msg="connecting to shim eae83d795e4ff01dfcabfd41984aec07c9dc91e0893c789f7cc9b3411becfa02" address="unix:///run/containerd/s/a6e460059dce338f750180ed762b1e57675fc463872533e19d5269084708bebb" protocol=ttrpc version=3 Jan 27 04:51:05.455203 containerd[1666]: time="2026-01-27T04:51:05.455154717Z" level=info msg="CreateContainer within sandbox \"1acf1486575e86539efe0574e057fe5b875e35a867014752d29000ebecb77924\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id 
\"93f98c508e5386ed04b25aebf40917960adf40cf3d82b7186f50f06da64472e6\"" Jan 27 04:51:05.455731 containerd[1666]: time="2026-01-27T04:51:05.455669119Z" level=info msg="StartContainer for \"93f98c508e5386ed04b25aebf40917960adf40cf3d82b7186f50f06da64472e6\"" Jan 27 04:51:05.456817 containerd[1666]: time="2026-01-27T04:51:05.456791125Z" level=info msg="connecting to shim 93f98c508e5386ed04b25aebf40917960adf40cf3d82b7186f50f06da64472e6" address="unix:///run/containerd/s/21c2f934621c4351eeaeac0c13b52aaa9db3181146d7cc55b04d404ddb3390bf" protocol=ttrpc version=3 Jan 27 04:51:05.471340 systemd[1]: Started cri-containerd-eae83d795e4ff01dfcabfd41984aec07c9dc91e0893c789f7cc9b3411becfa02.scope - libcontainer container eae83d795e4ff01dfcabfd41984aec07c9dc91e0893c789f7cc9b3411becfa02. Jan 27 04:51:05.474459 systemd[1]: Started cri-containerd-93f98c508e5386ed04b25aebf40917960adf40cf3d82b7186f50f06da64472e6.scope - libcontainer container 93f98c508e5386ed04b25aebf40917960adf40cf3d82b7186f50f06da64472e6. Jan 27 04:51:05.482000 audit: BPF prog-id=260 op=LOAD Jan 27 04:51:05.483000 audit: BPF prog-id=261 op=LOAD Jan 27 04:51:05.485125 kernel: audit: type=1334 audit(1769489465.482:923): prog-id=260 op=LOAD Jan 27 04:51:05.485197 kernel: audit: type=1334 audit(1769489465.483:924): prog-id=261 op=LOAD Jan 27 04:51:05.485227 kernel: audit: type=1300 audit(1769489465.483:924): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2978 pid=5642 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:51:05.483000 audit[5642]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2978 pid=5642 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:51:05.483000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561653833643739356534666630316466636162666434313938346165 Jan 27 04:51:05.491558 kernel: audit: type=1327 audit(1769489465.483:924): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561653833643739356534666630316466636162666434313938346165 Jan 27 04:51:05.483000 audit: BPF prog-id=261 op=UNLOAD Jan 27 04:51:05.483000 audit[5642]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2978 pid=5642 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:51:05.483000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561653833643739356534666630316466636162666434313938346165 Jan 27 04:51:05.483000 audit: BPF prog-id=262 op=LOAD Jan 27 04:51:05.483000 audit[5642]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2978 pid=5642 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:51:05.483000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561653833643739356534666630316466636162666434313938346165 Jan 27 04:51:05.484000 audit: BPF prog-id=263 op=LOAD Jan 27 04:51:05.484000 audit[5642]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2978 pid=5642 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:51:05.484000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561653833643739356534666630316466636162666434313938346165 Jan 27 04:51:05.487000 audit: BPF prog-id=263 op=UNLOAD Jan 27 04:51:05.487000 audit[5642]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2978 pid=5642 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:51:05.487000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561653833643739356534666630316466636162666434313938346165 Jan 27 04:51:05.487000 audit: BPF prog-id=262 op=UNLOAD Jan 27 04:51:05.487000 audit[5642]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2978 pid=5642 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:51:05.487000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561653833643739356534666630316466636162666434313938346165 Jan 27 04:51:05.487000 audit: BPF prog-id=264 op=LOAD Jan 27 04:51:05.487000 audit[5642]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2978 pid=5642 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:51:05.487000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6561653833643739356534666630316466636162666434313938346165 Jan 27 04:51:05.491000 audit: BPF prog-id=265 op=LOAD Jan 27 04:51:05.492000 audit: BPF prog-id=266 op=LOAD Jan 27 04:51:05.492000 audit[5648]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2582 pid=5648 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:51:05.492000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3933663938633530386535333836656430346232356165626634303931 Jan 27 04:51:05.493000 audit: BPF prog-id=266 op=UNLOAD Jan 27 04:51:05.493000 audit[5648]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=5648 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:51:05.493000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3933663938633530386535333836656430346232356165626634303931 Jan 27 04:51:05.493000 audit: BPF prog-id=267 op=LOAD Jan 27 04:51:05.493000 audit[5648]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2582 pid=5648 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:51:05.493000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3933663938633530386535333836656430346232356165626634303931 Jan 27 04:51:05.493000 audit: BPF prog-id=268 op=LOAD Jan 27 04:51:05.493000 audit[5648]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2582 pid=5648 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:51:05.493000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3933663938633530386535333836656430346232356165626634303931 Jan 27 04:51:05.493000 audit: BPF prog-id=268 op=UNLOAD Jan 27 04:51:05.493000 audit[5648]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=5648 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:51:05.493000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3933663938633530386535333836656430346232356165626634303931 Jan 27 04:51:05.493000 audit: BPF prog-id=267 op=UNLOAD Jan 27 04:51:05.493000 audit[5648]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=5648 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:51:05.493000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3933663938633530386535333836656430346232356165626634303931 Jan 27 04:51:05.493000 audit: BPF prog-id=269 op=LOAD Jan 27 04:51:05.493000 audit[5648]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2582 pid=5648 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 27 04:51:05.493000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3933663938633530386535333836656430346232356165626634303931 Jan 27 04:51:05.512382 containerd[1666]: time="2026-01-27T04:51:05.512337609Z" level=info msg="StartContainer for \"eae83d795e4ff01dfcabfd41984aec07c9dc91e0893c789f7cc9b3411becfa02\" returns successfully" Jan 27 04:51:05.527314 containerd[1666]: time="2026-01-27T04:51:05.527275925Z" level=info msg="StartContainer for \"93f98c508e5386ed04b25aebf40917960adf40cf3d82b7186f50f06da64472e6\" returns successfully" Jan 27 04:51:05.616293 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount6853426.mount: Deactivated successfully. Jan 27 04:51:06.262030 kubelet[2905]: E0127 04:51:06.261897 2905 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.3.32:54358->10.0.3.116:2379: read: connection timed out" event="&Event{ObjectMeta:{calico-apiserver-b5b8d765d-58pbp.188e7d1debcaf1aa calico-apiserver 1365 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-apiserver,Name:calico-apiserver-b5b8d765d-58pbp,UID:b46e8e69-6e20-4188-9b8d-4e06490f6e72,APIVersion:v1,ResourceVersion:813,FieldPath:spec.containers{calico-apiserver},},Reason:Pulling,Message:Pulling image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4592-0-0-n-c2731c5fad,},FirstTimestamp:2026-01-27 04:47:51 +0000 UTC,LastTimestamp:2026-01-27 04:50:55.825468948 +0000 UTC m=+232.083080401,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4592-0-0-n-c2731c5fad,}" Jan 27 04:51:07.285434 kubelet[2905]: I0127 04:51:07.285376 2905 status_manager.go:890] "Failed to get status for pod" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" pod="calico-system/whisker-68746999d7-v2tpn" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.3.32:54474->10.0.3.116:2379: read: connection timed out" Jan 27 04:51:08.825728 kubelet[2905]: E0127 04:51:08.825546 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-58pbp" podUID="b46e8e69-6e20-4188-9b8d-4e06490f6e72" Jan 27 04:51:08.826491 kubelet[2905]: E0127 04:51:08.826437 
2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68746999d7-v2tpn" podUID="a9e11ad0-9b2a-4daf-88aa-0d2f20b9fa33" Jan 27 04:51:09.826930 kubelet[2905]: E0127 04:51:09.826849 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-vk94b" podUID="56e3e6d0-7a6b-4ba3-9081-3231ea811709" Jan 27 04:51:10.969321 systemd[1]: cri-containerd-8b800740a81a7b9f349a3cd5a2ccc68af54c45769cf118158a697e47ee982379.scope: Deactivated successfully. Jan 27 04:51:10.969641 systemd[1]: cri-containerd-8b800740a81a7b9f349a3cd5a2ccc68af54c45769cf118158a697e47ee982379.scope: Consumed 3.558s CPU time, 23.6M memory peak. Jan 27 04:51:10.970000 audit: BPF prog-id=270 op=LOAD Jan 27 04:51:10.971853 containerd[1666]: time="2026-01-27T04:51:10.971819302Z" level=info msg="received container exit event container_id:\"8b800740a81a7b9f349a3cd5a2ccc68af54c45769cf118158a697e47ee982379\" id:\"8b800740a81a7b9f349a3cd5a2ccc68af54c45769cf118158a697e47ee982379\" pid:2720 exit_status:1 exited_at:{seconds:1769489470 nanos:971416140}" Jan 27 04:51:10.972330 kernel: kauditd_printk_skb: 40 callbacks suppressed Jan 27 04:51:10.972379 kernel: audit: type=1334 audit(1769489470.970:939): prog-id=270 op=LOAD Jan 27 04:51:10.972398 kernel: audit: type=1334 audit(1769489470.970:940): prog-id=91 op=UNLOAD Jan 27 04:51:10.970000 audit: BPF prog-id=91 op=UNLOAD Jan 27 04:51:10.973000 audit: BPF prog-id=101 op=UNLOAD Jan 27 04:51:10.973000 audit: BPF prog-id=105 op=UNLOAD Jan 27 04:51:10.975740 kernel: audit: type=1334 audit(1769489470.973:941): prog-id=101 op=UNLOAD Jan 27 04:51:10.975790 kernel: audit: type=1334 audit(1769489470.973:942): prog-id=105 op=UNLOAD Jan 27 04:51:10.992494 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8b800740a81a7b9f349a3cd5a2ccc68af54c45769cf118158a697e47ee982379-rootfs.mount: Deactivated successfully. 
Jan 27 04:51:11.437247 kubelet[2905]: I0127 04:51:11.436781 2905 scope.go:117] "RemoveContainer" containerID="8b800740a81a7b9f349a3cd5a2ccc68af54c45769cf118158a697e47ee982379"
Jan 27 04:51:11.438739 containerd[1666]: time="2026-01-27T04:51:11.438688524Z" level=info msg="CreateContainer within sandbox \"a5054b363b167c4dcd17a989241103c1998aa7752a8c543188088b6a45394437\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Jan 27 04:51:11.459133 containerd[1666]: time="2026-01-27T04:51:11.458968707Z" level=info msg="Container 07de296990f873698d4add93d383324ac883f052ebe1caabab0d157938eb8091: CDI devices from CRI Config.CDIDevices: []"
Jan 27 04:51:11.470270 containerd[1666]: time="2026-01-27T04:51:11.470211485Z" level=info msg="CreateContainer within sandbox \"a5054b363b167c4dcd17a989241103c1998aa7752a8c543188088b6a45394437\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"07de296990f873698d4add93d383324ac883f052ebe1caabab0d157938eb8091\""
Jan 27 04:51:11.470803 containerd[1666]: time="2026-01-27T04:51:11.470756407Z" level=info msg="StartContainer for \"07de296990f873698d4add93d383324ac883f052ebe1caabab0d157938eb8091\""
Jan 27 04:51:11.471824 containerd[1666]: time="2026-01-27T04:51:11.471787893Z" level=info msg="connecting to shim 07de296990f873698d4add93d383324ac883f052ebe1caabab0d157938eb8091" address="unix:///run/containerd/s/6a8e1897254a49e2cdc3dc150020478f7d681d6af7ea52f06624d749ef58ad7f" protocol=ttrpc version=3
Jan 27 04:51:11.490461 systemd[1]: Started cri-containerd-07de296990f873698d4add93d383324ac883f052ebe1caabab0d157938eb8091.scope - libcontainer container 07de296990f873698d4add93d383324ac883f052ebe1caabab0d157938eb8091.
Jan 27 04:51:11.500000 audit: BPF prog-id=271 op=LOAD
Jan 27 04:51:11.507135 kernel: audit: type=1334 audit(1769489471.500:943): prog-id=271 op=LOAD
Jan 27 04:51:11.507297 kernel: audit: type=1334 audit(1769489471.501:944): prog-id=272 op=LOAD
Jan 27 04:51:11.507391 kernel: audit: type=1300 audit(1769489471.501:944): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2613 pid=5721 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 27 04:51:11.507477 kernel: audit: type=1327 audit(1769489471.501:944): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3037646532393639393066383733363938643461646439336433383333
Jan 27 04:51:11.501000 audit: BPF prog-id=272 op=LOAD
Jan 27 04:51:11.501000 audit[5721]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2613 pid=5721 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 27 04:51:11.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3037646532393639393066383733363938643461646439336433383333
Jan 27 04:51:11.502000 audit: BPF prog-id=272 op=UNLOAD
Jan 27 04:51:11.510610 kernel: audit: type=1334 audit(1769489471.502:945): prog-id=272 op=UNLOAD
Jan 27 04:51:11.510646 kernel: audit: type=1300 audit(1769489471.502:945): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=5721 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 27 04:51:11.502000 audit[5721]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=5721 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 27 04:51:11.502000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3037646532393639393066383733363938643461646439336433383333
Jan 27 04:51:11.502000 audit: BPF prog-id=273 op=LOAD
Jan 27 04:51:11.502000 audit[5721]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2613 pid=5721 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 27 04:51:11.502000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3037646532393639393066383733363938643461646439336433383333
Jan 27 04:51:11.502000 audit: BPF prog-id=274 op=LOAD
Jan 27 04:51:11.502000 audit[5721]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2613 pid=5721 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 27 04:51:11.502000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3037646532393639393066383733363938643461646439336433383333
Jan 27 04:51:11.502000 audit: BPF prog-id=274 op=UNLOAD
Jan 27 04:51:11.502000 audit[5721]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=5721 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 27 04:51:11.502000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3037646532393639393066383733363938643461646439336433383333
Jan 27 04:51:11.502000 audit: BPF prog-id=273 op=UNLOAD
Jan 27 04:51:11.502000 audit[5721]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2613 pid=5721 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 27 04:51:11.502000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3037646532393639393066383733363938643461646439336433383333
Jan 27 04:51:11.502000 audit: BPF prog-id=275 op=LOAD
Jan 27 04:51:11.502000 audit[5721]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2613 pid=5721 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 27 04:51:11.502000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3037646532393639393066383733363938643461646439336433383333
Jan 27 04:51:11.539061 containerd[1666]: time="2026-01-27T04:51:11.538945155Z" level=info msg="StartContainer for \"07de296990f873698d4add93d383324ac883f052ebe1caabab0d157938eb8091\" returns successfully"
Jan 27 04:51:11.828334 kubelet[2905]: E0127 04:51:11.828287 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b5b8d765d-qg4kw" podUID="e725b346-d7db-48ca-8580-5074b068cd87"
Jan 27 04:51:12.825166 kubelet[2905]: E0127 04:51:12.825059 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-799789d486-wpdhm" podUID="5edc651f-3273-46b9-a554-3e38c11ea910"
Jan 27 04:51:15.046504 kubelet[2905]: E0127 04:51:15.046441 2905 controller.go:195] "Failed to update lease" err="Put \"https://10.0.3.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4592-0-0-n-c2731c5fad?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 27 04:51:16.707056 systemd[1]: cri-containerd-eae83d795e4ff01dfcabfd41984aec07c9dc91e0893c789f7cc9b3411becfa02.scope: Deactivated successfully.
Jan 27 04:51:16.708518 containerd[1666]: time="2026-01-27T04:51:16.708470209Z" level=info msg="received container exit event container_id:\"eae83d795e4ff01dfcabfd41984aec07c9dc91e0893c789f7cc9b3411becfa02\" id:\"eae83d795e4ff01dfcabfd41984aec07c9dc91e0893c789f7cc9b3411becfa02\" pid:5668 exit_status:1 exited_at:{seconds:1769489476 nanos:708246808}"
Jan 27 04:51:16.711000 audit: BPF prog-id=260 op=UNLOAD
Jan 27 04:51:16.713529 kernel: kauditd_printk_skb: 16 callbacks suppressed
Jan 27 04:51:16.713604 kernel: audit: type=1334 audit(1769489476.711:951): prog-id=260 op=UNLOAD
Jan 27 04:51:16.713623 kernel: audit: type=1334 audit(1769489476.711:952): prog-id=264 op=UNLOAD
Jan 27 04:51:16.711000 audit: BPF prog-id=264 op=UNLOAD
Jan 27 04:51:16.730323 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-eae83d795e4ff01dfcabfd41984aec07c9dc91e0893c789f7cc9b3411becfa02-rootfs.mount: Deactivated successfully.
Jan 27 04:51:17.453112 kubelet[2905]: I0127 04:51:17.453024 2905 scope.go:117] "RemoveContainer" containerID="e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a"
Jan 27 04:51:17.453600 kubelet[2905]: I0127 04:51:17.453403 2905 scope.go:117] "RemoveContainer" containerID="eae83d795e4ff01dfcabfd41984aec07c9dc91e0893c789f7cc9b3411becfa02"
Jan 27 04:51:17.453600 kubelet[2905]: E0127 04:51:17.453565 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-49b9p_tigera-operator(e38e7754-e013-45ac-9d88-ca7e4c7a3653)\"" pod="tigera-operator/tigera-operator-7dcd859c48-49b9p" podUID="e38e7754-e013-45ac-9d88-ca7e4c7a3653"
Jan 27 04:51:17.455105 containerd[1666]: time="2026-01-27T04:51:17.455062298Z" level=info msg="RemoveContainer for \"e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a\""
Jan 27 04:51:17.464440 containerd[1666]: time="2026-01-27T04:51:17.464364906Z" level=info msg="RemoveContainer for \"e79d7aa7482d5f4f0439f0752e715311de5d1bcf24f406c693e7adb99c04535a\" returns successfully"
Jan 27 04:51:18.826223 kubelet[2905]: E0127 04:51:18.826160 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-trbp2" podUID="1f1398e6-fd52-4fce-b3fa-2a2bc91ba72b"