Jan 15 00:21:32.431922 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Jan 15 00:21:32.431946 kernel: Linux version 6.12.65-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Wed Jan 14 22:02:18 -00 2026
Jan 15 00:21:32.431957 kernel: KASLR enabled
Jan 15 00:21:32.431963 kernel: efi: EFI v2.7 by EDK II
Jan 15 00:21:32.431969 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438357218
Jan 15 00:21:32.431975 kernel: random: crng init done
Jan 15 00:21:32.431982 kernel: secureboot: Secure boot disabled
Jan 15 00:21:32.431989 kernel: ACPI: Early table checksum verification disabled
Jan 15 00:21:32.431995 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Jan 15 00:21:32.432003 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Jan 15 00:21:32.432010 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 00:21:32.432016 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 00:21:32.432022 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 00:21:32.432028 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 00:21:32.432038 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 00:21:32.432044 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 00:21:32.432134 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 00:21:32.432146 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 00:21:32.432153 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 00:21:32.432160 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 15 00:21:32.432166 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Jan 15 00:21:32.432186 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Jan 15 00:21:32.432193 kernel: ACPI: Use ACPI SPCR as default console: Yes
Jan 15 00:21:32.432204 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Jan 15 00:21:32.432210 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Jan 15 00:21:32.432217 kernel: Zone ranges:
Jan 15 00:21:32.432224 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Jan 15 00:21:32.432230 kernel: DMA32 empty
Jan 15 00:21:32.432237 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Jan 15 00:21:32.432243 kernel: Device empty
Jan 15 00:21:32.432250 kernel: Movable zone start for each node
Jan 15 00:21:32.432256 kernel: Early memory node ranges
Jan 15 00:21:32.432263 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff]
Jan 15 00:21:32.432270 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff]
Jan 15 00:21:32.432276 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff]
Jan 15 00:21:32.432284 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff]
Jan 15 00:21:32.432291 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff]
Jan 15 00:21:32.432297 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Jan 15 00:21:32.432304 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Jan 15 00:21:32.432311 kernel: psci: probing for conduit method from ACPI.
Jan 15 00:21:32.432320 kernel: psci: PSCIv1.3 detected in firmware.
Jan 15 00:21:32.432329 kernel: psci: Using standard PSCI v0.2 function IDs
Jan 15 00:21:32.432336 kernel: psci: Trusted OS migration not required
Jan 15 00:21:32.432342 kernel: psci: SMC Calling Convention v1.1
Jan 15 00:21:32.432350 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Jan 15 00:21:32.432356 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Jan 15 00:21:32.432363 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Jan 15 00:21:32.432370 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Jan 15 00:21:32.432377 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Jan 15 00:21:32.432386 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Jan 15 00:21:32.432393 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Jan 15 00:21:32.432400 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Jan 15 00:21:32.432407 kernel: Detected PIPT I-cache on CPU0
Jan 15 00:21:32.432414 kernel: CPU features: detected: GIC system register CPU interface
Jan 15 00:21:32.432421 kernel: CPU features: detected: Spectre-v4
Jan 15 00:21:32.432428 kernel: CPU features: detected: Spectre-BHB
Jan 15 00:21:32.432435 kernel: CPU features: kernel page table isolation forced ON by KASLR
Jan 15 00:21:32.432442 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Jan 15 00:21:32.432449 kernel: CPU features: detected: ARM erratum 1418040
Jan 15 00:21:32.432456 kernel: CPU features: detected: SSBS not fully self-synchronizing
Jan 15 00:21:32.432464 kernel: alternatives: applying boot alternatives
Jan 15 00:21:32.432472 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=e4a6d042213df6c386c00b2ef561482ef59cf24ca6770345ce520c577e366e5a
Jan 15 00:21:32.432480 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Jan 15 00:21:32.432487 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 15 00:21:32.432494 kernel: Fallback order for Node 0: 0
Jan 15 00:21:32.432501 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Jan 15 00:21:32.432508 kernel: Policy zone: Normal
Jan 15 00:21:32.432515 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 15 00:21:32.432521 kernel: software IO TLB: area num 4.
Jan 15 00:21:32.432528 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Jan 15 00:21:32.432537 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Jan 15 00:21:32.432544 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 15 00:21:32.432552 kernel: rcu: RCU event tracing is enabled.
Jan 15 00:21:32.432559 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Jan 15 00:21:32.432566 kernel: Trampoline variant of Tasks RCU enabled.
Jan 15 00:21:32.432573 kernel: Tracing variant of Tasks RCU enabled.
Jan 15 00:21:32.432580 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 15 00:21:32.432587 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Jan 15 00:21:32.432594 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 15 00:21:32.432601 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 15 00:21:32.432608 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Jan 15 00:21:32.432616 kernel: GICv3: 256 SPIs implemented
Jan 15 00:21:32.432623 kernel: GICv3: 0 Extended SPIs implemented
Jan 15 00:21:32.432630 kernel: Root IRQ handler: gic_handle_irq
Jan 15 00:21:32.432637 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Jan 15 00:21:32.432644 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Jan 15 00:21:32.432651 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Jan 15 00:21:32.432658 kernel: ITS [mem 0x08080000-0x0809ffff]
Jan 15 00:21:32.432665 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Jan 15 00:21:32.432672 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Jan 15 00:21:32.432679 kernel: GICv3: using LPI property table @0x0000000100130000
Jan 15 00:21:32.432686 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Jan 15 00:21:32.432693 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 15 00:21:32.432702 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 15 00:21:32.432710 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Jan 15 00:21:32.432717 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Jan 15 00:21:32.432724 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Jan 15 00:21:32.432731 kernel: arm-pv: using stolen time PV
Jan 15 00:21:32.432739 kernel: Console: colour dummy device 80x25
Jan 15 00:21:32.432746 kernel: ACPI: Core revision 20240827
Jan 15 00:21:32.432754 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Jan 15 00:21:32.432763 kernel: pid_max: default: 32768 minimum: 301
Jan 15 00:21:32.432770 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jan 15 00:21:32.432778 kernel: landlock: Up and running.
Jan 15 00:21:32.432785 kernel: SELinux: Initializing.
Jan 15 00:21:32.432792 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 15 00:21:32.432800 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 15 00:21:32.432807 kernel: rcu: Hierarchical SRCU implementation.
Jan 15 00:21:32.432814 kernel: rcu: Max phase no-delay instances is 400.
Jan 15 00:21:32.432823 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jan 15 00:21:32.432832 kernel: Remapping and enabling EFI services.
Jan 15 00:21:32.432839 kernel: smp: Bringing up secondary CPUs ...
Jan 15 00:21:32.432846 kernel: Detected PIPT I-cache on CPU1
Jan 15 00:21:32.432853 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Jan 15 00:21:32.432861 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Jan 15 00:21:32.432868 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 15 00:21:32.432877 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Jan 15 00:21:32.432884 kernel: Detected PIPT I-cache on CPU2
Jan 15 00:21:32.432897 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Jan 15 00:21:32.432906 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Jan 15 00:21:32.432913 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 15 00:21:32.432920 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Jan 15 00:21:32.432928 kernel: Detected PIPT I-cache on CPU3
Jan 15 00:21:32.432936 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Jan 15 00:21:32.432945 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Jan 15 00:21:32.432953 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 15 00:21:32.432960 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Jan 15 00:21:32.432968 kernel: smp: Brought up 1 node, 4 CPUs
Jan 15 00:21:32.432975 kernel: SMP: Total of 4 processors activated.
Jan 15 00:21:32.432983 kernel: CPU: All CPU(s) started at EL1
Jan 15 00:21:32.432992 kernel: CPU features: detected: 32-bit EL0 Support
Jan 15 00:21:32.432999 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Jan 15 00:21:32.433007 kernel: CPU features: detected: Common not Private translations
Jan 15 00:21:32.433015 kernel: CPU features: detected: CRC32 instructions
Jan 15 00:21:32.433022 kernel: CPU features: detected: Enhanced Virtualization Traps
Jan 15 00:21:32.433030 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Jan 15 00:21:32.433037 kernel: CPU features: detected: LSE atomic instructions
Jan 15 00:21:32.433046 kernel: CPU features: detected: Privileged Access Never
Jan 15 00:21:32.433054 kernel: CPU features: detected: RAS Extension Support
Jan 15 00:21:32.433061 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Jan 15 00:21:32.433069 kernel: alternatives: applying system-wide alternatives
Jan 15 00:21:32.433077 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Jan 15 00:21:32.433085 kernel: Memory: 16324496K/16777216K available (11200K kernel code, 2458K rwdata, 9088K rodata, 12416K init, 1038K bss, 429936K reserved, 16384K cma-reserved)
Jan 15 00:21:32.433092 kernel: devtmpfs: initialized
Jan 15 00:21:32.433101 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 15 00:21:32.433109 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Jan 15 00:21:32.433117 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Jan 15 00:21:32.433124 kernel: 0 pages in range for non-PLT usage
Jan 15 00:21:32.433132 kernel: 515184 pages in range for PLT usage
Jan 15 00:21:32.433139 kernel: pinctrl core: initialized pinctrl subsystem
Jan 15 00:21:32.433147 kernel: SMBIOS 3.0.0 present.
Jan 15 00:21:32.433154 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015 Jan 15 00:21:32.433163 kernel: DMI: Memory slots populated: 1/1 Jan 15 00:21:32.433235 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 15 00:21:32.433247 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations Jan 15 00:21:32.433255 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Jan 15 00:21:32.433263 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Jan 15 00:21:32.433271 kernel: audit: initializing netlink subsys (disabled) Jan 15 00:21:32.433278 kernel: audit: type=2000 audit(0.037:1): state=initialized audit_enabled=0 res=1 Jan 15 00:21:32.433288 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 15 00:21:32.433296 kernel: cpuidle: using governor menu Jan 15 00:21:32.433304 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Jan 15 00:21:32.433311 kernel: ASID allocator initialised with 32768 entries Jan 15 00:21:32.433319 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 15 00:21:32.433326 kernel: Serial: AMBA PL011 UART driver Jan 15 00:21:32.433334 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 15 00:21:32.433344 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Jan 15 00:21:32.433352 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Jan 15 00:21:32.433359 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Jan 15 00:21:32.433367 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 15 00:21:32.433374 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Jan 15 00:21:32.433382 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Jan 15 00:21:32.433389 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Jan 15 00:21:32.433397 kernel: ACPI: Added _OSI(Module Device) Jan 15 00:21:32.433406 kernel: ACPI: Added _OSI(Processor Device) Jan 15 00:21:32.433414 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 15 00:21:32.433421 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 15 00:21:32.433429 kernel: ACPI: Interpreter enabled Jan 15 00:21:32.433436 kernel: ACPI: Using GIC for interrupt routing Jan 15 00:21:32.433444 kernel: ACPI: MCFG table detected, 1 entries Jan 15 00:21:32.433451 kernel: ACPI: CPU0 has been hot-added Jan 15 00:21:32.433460 kernel: ACPI: CPU1 has been hot-added Jan 15 00:21:32.433468 kernel: ACPI: CPU2 has been hot-added Jan 15 00:21:32.433476 kernel: ACPI: CPU3 has been hot-added Jan 15 00:21:32.433483 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Jan 15 00:21:32.433491 kernel: printk: legacy console [ttyAMA0] enabled Jan 15 00:21:32.433504 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jan 15 00:21:32.433689 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 15 00:21:32.433816 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Jan 15 00:21:32.433909 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Jan 15 00:21:32.434116 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Jan 15 00:21:32.434232 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Jan 15 00:21:32.434245 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Jan 15 00:21:32.434253 
kernel: PCI host bridge to bus 0000:00 Jan 15 00:21:32.434354 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Jan 15 00:21:32.434438 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Jan 15 00:21:32.434518 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Jan 15 00:21:32.434690 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jan 15 00:21:32.434830 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Jan 15 00:21:32.434941 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.435035 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff] Jan 15 00:21:32.435122 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 15 00:21:32.435228 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff] Jan 15 00:21:32.435318 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref] Jan 15 00:21:32.435413 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.435502 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff] Jan 15 00:21:32.435587 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Jan 15 00:21:32.435672 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff] Jan 15 00:21:32.435764 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.435857 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff] Jan 15 00:21:32.435946 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Jan 15 00:21:32.436032 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff] Jan 15 00:21:32.436117 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref] Jan 15 00:21:32.436221 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.436309 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff] Jan 15 00:21:32.436394 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Jan 15 00:21:32.436482 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref] Jan 15 00:21:32.436575 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.436661 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff] Jan 15 00:21:32.436748 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Jan 15 00:21:32.436833 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff] Jan 15 00:21:32.436918 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref] Jan 15 00:21:32.437012 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.437097 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff] Jan 15 00:21:32.437192 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Jan 15 00:21:32.437282 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff] Jan 15 00:21:32.437374 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref] Jan 15 00:21:32.437468 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.437558 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff] Jan 15 00:21:32.437643 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Jan 15 00:21:32.437734 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.437819 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff] Jan 15 00:21:32.437904 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Jan 15 
00:21:32.438015 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.438109 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff] Jan 15 00:21:32.438207 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Jan 15 00:21:32.438304 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.438390 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff] Jan 15 00:21:32.438479 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Jan 15 00:21:32.438573 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.438659 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff] Jan 15 00:21:32.438744 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Jan 15 00:21:32.438855 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.438946 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff] Jan 15 00:21:32.439037 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Jan 15 00:21:32.439229 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.439359 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff] Jan 15 00:21:32.439449 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Jan 15 00:21:32.439546 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.439632 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff] Jan 15 00:21:32.439721 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Jan 15 00:21:32.439878 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.439995 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff] Jan 15 00:21:32.440082 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Jan 15 00:21:32.440205 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.440308 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff] Jan 15 00:21:32.440395 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Jan 15 00:21:32.440488 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.440574 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff] Jan 15 00:21:32.440657 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Jan 15 00:21:32.440748 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.440839 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff] Jan 15 00:21:32.440924 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Jan 15 00:21:32.441009 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff] Jan 15 00:21:32.441094 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff] Jan 15 00:21:32.441204 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.441296 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff] Jan 15 00:21:32.441386 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Jan 15 00:21:32.441471 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff] Jan 15 00:21:32.441555 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff] Jan 15 00:21:32.441653 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.441739 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff] Jan 15 00:21:32.441823 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Jan 15 00:21:32.441910 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff] Jan 15 00:21:32.441994 kernel: pci 0000:00:03.3: bridge window [mem 
0x11a00000-0x11bfffff] Jan 15 00:21:32.442087 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.442182 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff] Jan 15 00:21:32.442273 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Jan 15 00:21:32.442359 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff] Jan 15 00:21:32.442447 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff] Jan 15 00:21:32.442649 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.442746 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff] Jan 15 00:21:32.442851 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Jan 15 00:21:32.442944 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff] Jan 15 00:21:32.443102 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff] Jan 15 00:21:32.443230 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.443323 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff] Jan 15 00:21:32.443412 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Jan 15 00:21:32.443501 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff] Jan 15 00:21:32.443588 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff] Jan 15 00:21:32.443679 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.443767 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff] Jan 15 00:21:32.443853 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Jan 15 00:21:32.443940 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff] Jan 15 00:21:32.444026 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff] Jan 15 00:21:32.444119 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.444224 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff] Jan 15 00:21:32.444318 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Jan 15 00:21:32.444406 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff] Jan 15 00:21:32.444494 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff] Jan 15 00:21:32.444591 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.444682 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff] Jan 15 00:21:32.444773 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Jan 15 00:21:32.444867 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff] Jan 15 00:21:32.444968 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff] Jan 15 00:21:32.445065 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.445155 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff] Jan 15 00:21:32.445254 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Jan 15 00:21:32.445343 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff] Jan 15 00:21:32.445432 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff] Jan 15 00:21:32.445525 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.445612 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff] Jan 15 00:21:32.445697 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Jan 15 00:21:32.445783 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff] Jan 15 00:21:32.445869 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff] Jan 15 00:21:32.445976 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 
00:21:32.446068 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff] Jan 15 00:21:32.446157 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Jan 15 00:21:32.446287 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff] Jan 15 00:21:32.446394 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff] Jan 15 00:21:32.446497 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.446599 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff] Jan 15 00:21:32.446684 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Jan 15 00:21:32.446828 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff] Jan 15 00:21:32.446927 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff] Jan 15 00:21:32.447022 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.447109 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff] Jan 15 00:21:32.447217 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Jan 15 00:21:32.447308 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff] Jan 15 00:21:32.447514 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff] Jan 15 00:21:32.447632 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.447719 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff] Jan 15 00:21:32.447871 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Jan 15 00:21:32.447969 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff] Jan 15 00:21:32.448056 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff] Jan 15 00:21:32.448149 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 00:21:32.448266 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff] Jan 15 00:21:32.448354 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Jan 15 00:21:32.448448 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff] Jan 15 00:21:32.448534 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff] Jan 15 00:21:32.448633 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 15 00:21:32.448723 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff] Jan 15 00:21:32.448813 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Jan 15 00:21:32.448904 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 15 00:21:32.449000 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Jan 15 00:21:32.449092 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit] Jan 15 00:21:32.449307 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Jan 15 00:21:32.449433 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff] Jan 15 00:21:32.449526 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref] Jan 15 00:21:32.449697 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 15 00:21:32.449801 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref] Jan 15 00:21:32.449906 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 15 00:21:32.450111 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff] Jan 15 00:21:32.450258 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref] Jan 15 00:21:32.450365 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint Jan 15 00:21:32.450454 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff] Jan 15 00:21:32.450543 kernel: pci 
0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref] Jan 15 00:21:32.450631 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Jan 15 00:21:32.450723 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 Jan 15 00:21:32.450827 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Jan 15 00:21:32.450931 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Jan 15 00:21:32.451018 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Jan 15 00:21:32.451107 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Jan 15 00:21:32.451322 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jan 15 00:21:32.451416 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Jan 15 00:21:32.451504 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Jan 15 00:21:32.451595 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jan 15 00:21:32.451686 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Jan 15 00:21:32.451772 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Jan 15 00:21:32.451861 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Jan 15 00:21:32.451947 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Jan 15 00:21:32.452102 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 Jan 15 00:21:32.452239 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 15 00:21:32.452340 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Jan 15 00:21:32.452427 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Jan 15 00:21:32.452517 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 15 00:21:32.452604 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000 Jan 15 00:21:32.452689 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000 Jan 15 00:21:32.452780 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 15 00:21:32.452868 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Jan 15 00:21:32.452953 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Jan 15 00:21:32.453045 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 15 00:21:32.453131 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Jan 15 00:21:32.453234 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] 
add_size 200000 add_align 100000 Jan 15 00:21:32.453331 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 15 00:21:32.453418 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000 Jan 15 00:21:32.453505 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000 Jan 15 00:21:32.453599 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Jan 15 00:21:32.453688 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Jan 15 00:21:32.453774 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000 Jan 15 00:21:32.453866 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Jan 15 00:21:32.453951 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000 Jan 15 00:21:32.454035 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000 Jan 15 00:21:32.454124 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Jan 15 00:21:32.454218 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000 Jan 15 00:21:32.454305 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000 Jan 15 00:21:32.454399 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 15 00:21:32.454486 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Jan 15 00:21:32.454571 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Jan 15 00:21:32.454660 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 15 00:21:32.454746 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Jan 15 00:21:32.454848 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Jan 15 00:21:32.454941 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 15 00:21:32.455027 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Jan 15 00:21:32.455112 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Jan 15 00:21:32.455210 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 15 00:21:32.455297 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Jan 15 00:21:32.455385 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Jan 15 00:21:32.455475 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 15 00:21:32.455562 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Jan 15 00:21:32.455648 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 
Jan 15 00:21:32.455737 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 15 00:21:32.455826 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Jan 15 00:21:32.455911 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Jan 15 00:21:32.456002 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 15 00:21:32.456088 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Jan 15 00:21:32.456180 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Jan 15 00:21:32.456293 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 15 00:21:32.456385 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Jan 15 00:21:32.456472 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Jan 15 00:21:32.456561 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 15 00:21:32.456647 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Jan 15 00:21:32.456733 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Jan 15 00:21:32.456823 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 15 00:21:32.456912 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Jan 15 00:21:32.456997 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Jan 15 00:21:32.457086 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 15 00:21:32.457184 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Jan 15 00:21:32.457283 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Jan 15 00:21:32.457379 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 15 00:21:32.457467 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Jan 15 00:21:32.457554 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Jan 15 00:21:32.457645 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 15 00:21:32.457732 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Jan 15 00:21:32.457817 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Jan 15 00:21:32.457907 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 15 00:21:32.457994 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Jan 15 00:21:32.458079 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Jan 15 00:21:32.458168 kernel: pci 
0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 15 00:21:32.458285 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Jan 15 00:21:32.458376 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Jan 15 00:21:32.458466 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 15 00:21:32.458567 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Jan 15 00:21:32.458661 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Jan 15 00:21:32.458757 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 15 00:21:32.458861 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Jan 15 00:21:32.458955 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Jan 15 00:21:32.459043 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 15 00:21:32.459130 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000 Jan 15 00:21:32.459229 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Jan 15 00:21:32.459319 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 15 00:21:32.459406 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Jan 15 00:21:32.459493 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Jan 15 00:21:32.459593 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 15 00:21:32.459682 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Jan 15 00:21:32.459768 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Jan 15 00:21:32.459855 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Jan 15 00:21:32.459944 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned Jan 15 00:21:32.460031 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Jan 15 00:21:32.460255 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Jan 15 00:21:32.460359 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Jan 15 00:21:32.460449 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Jan 15 00:21:32.460539 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Jan 15 00:21:32.460626 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Jan 15 00:21:32.460719 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Jan 15 00:21:32.460805 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Jan 15 00:21:32.460893 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Jan 15 00:21:32.460979 kernel: pci 0000:00:01.5: bridge 
window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Jan 15 00:21:32.461140 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Jan 15 00:21:32.461274 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Jan 15 00:21:32.461374 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Jan 15 00:21:32.461465 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Jan 15 00:21:32.461553 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Jan 15 00:21:32.461638 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Jan 15 00:21:32.461726 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Jan 15 00:21:32.461816 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: assigned Jan 15 00:21:32.461905 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Jan 15 00:21:32.461991 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Jan 15 00:21:32.462079 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Jan 15 00:21:32.462164 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Jan 15 00:21:32.462271 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Jan 15 00:21:32.462358 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Jan 15 00:21:32.462446 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Jan 15 00:21:32.462545 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned Jan 15 00:21:32.462658 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Jan 15 00:21:32.462748 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Jan 15 00:21:32.462857 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Jan 15 00:21:32.462948 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Jan 15 00:21:32.463036 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Jan 15 00:21:32.463126 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Jan 15 00:21:32.463231 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Jan 15 00:21:32.463320 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Jan 15 00:21:32.463409 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Jan 15 00:21:32.463495 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Jan 15 00:21:32.463584 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Jan 15 00:21:32.463675 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Jan 15 00:21:32.463764 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Jan 15 00:21:32.463850 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Jan 15 00:21:32.463939 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Jan 15 00:21:32.464025 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Jan 15 00:21:32.464114 
kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Jan 15 00:21:32.464209 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Jan 15 00:21:32.464311 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Jan 15 00:21:32.464399 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Jan 15 00:21:32.464489 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Jan 15 00:21:32.464578 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Jan 15 00:21:32.464669 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Jan 15 00:21:32.464757 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Jan 15 00:21:32.464848 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Jan 15 00:21:32.464934 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Jan 15 00:21:32.465027 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Jan 15 00:21:32.465113 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Jan 15 00:21:32.465214 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff]: assigned Jan 15 00:21:32.465307 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Jan 15 00:21:32.465401 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Jan 15 00:21:32.465489 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Jan 15 00:21:32.465579 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Jan 15 00:21:32.465666 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Jan 15 00:21:32.465766 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Jan 15 00:21:32.465855 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Jan 15 00:21:32.465945 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned Jan 15 00:21:32.466036 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Jan 15 00:21:32.466125 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Jan 15 00:21:32.466228 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Jan 15 00:21:32.466320 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Jan 15 00:21:32.466407 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Jan 15 00:21:32.466493 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Jan 15 00:21:32.466581 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Jan 15 00:21:32.466670 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Jan 15 00:21:32.466758 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Jan 15 00:21:32.466870 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Jan 15 00:21:32.466962 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Jan 15 00:21:32.467084 kernel: pci 0000:00:01.5: BAR 0 [mem 0x14205000-0x14205fff]: assigned Jan 15 00:21:32.467186 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Jan 15 00:21:32.467289 kernel: pci 0000:00:01.6: BAR 0 [mem 
0x14206000-0x14206fff]: assigned Jan 15 00:21:32.467377 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Jan 15 00:21:32.467473 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Jan 15 00:21:32.467567 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Jan 15 00:21:32.467662 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Jan 15 00:21:32.467749 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Jan 15 00:21:32.467847 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Jan 15 00:21:32.467932 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned Jan 15 00:21:32.468036 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Jan 15 00:21:32.468136 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Jan 15 00:21:32.468240 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Jan 15 00:21:32.468329 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Jan 15 00:21:32.468417 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Jan 15 00:21:32.468508 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Jan 15 00:21:32.468595 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Jan 15 00:21:32.468701 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Jan 15 00:21:32.468788 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Jan 15 00:21:32.468878 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Jan 15 00:21:32.468967 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Jan 15 00:21:32.469053 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.469151 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.469256 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Jan 15 00:21:32.469345 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.469435 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.469522 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Jan 15 00:21:32.469610 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.469698 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.469787 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Jan 15 00:21:32.469874 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.469962 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.470050 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Jan 15 00:21:32.470137 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.470235 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.470326 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Jan 15 00:21:32.470416 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.470504 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.470603 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Jan 15 00:21:32.470697 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't 
assign; no space Jan 15 00:21:32.470783 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.470890 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Jan 15 00:21:32.470979 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.471066 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.471158 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Jan 15 00:21:32.471274 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.471363 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.471451 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Jan 15 00:21:32.471537 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.471621 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.471715 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Jan 15 00:21:32.471810 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.471896 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.471998 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Jan 15 00:21:32.472092 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.472196 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.472291 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Jan 15 00:21:32.472381 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.472466 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.472553 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Jan 15 00:21:32.472637 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.472729 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.472824 kernel: pci 0000:00:04.5: BAR 0 [mem 0x1421d000-0x1421dfff]: assigned Jan 15 00:21:32.472924 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.473012 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.473102 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Jan 15 00:21:32.473208 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.473308 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.473404 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Jan 15 00:21:32.473492 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.473581 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.473668 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned Jan 15 00:21:32.473754 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.473852 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.473945 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Jan 15 00:21:32.474034 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Jan 15 
00:21:32.474123 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Jan 15 00:21:32.474223 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Jan 15 00:21:32.474313 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Jan 15 00:21:32.474403 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Jan 15 00:21:32.474505 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Jan 15 00:21:32.474602 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Jan 15 00:21:32.474691 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Jan 15 00:21:32.474787 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff]: assigned Jan 15 00:21:32.474896 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Jan 15 00:21:32.474985 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Jan 15 00:21:32.475088 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Jan 15 00:21:32.475207 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Jan 15 00:21:32.475303 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Jan 15 00:21:32.475408 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.475504 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.475611 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.475701 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.475790 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.475876 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.475964 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.476051 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.476150 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.476258 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.476365 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.476459 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.476574 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.476672 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.476763 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.476852 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.476940 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.477026 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.477117 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.477225 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.477318 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.477414 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.477519 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.477614 kernel: pci 
0000:00:01.6: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.477720 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.477810 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.477898 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.477987 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.478083 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.478186 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.478311 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.478409 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.478509 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.478606 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.478695 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space Jan 15 00:21:32.478780 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Jan 15 00:21:32.478899 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Jan 15 00:21:32.478991 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Jan 15 00:21:32.479082 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Jan 15 00:21:32.479189 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 15 00:21:32.479284 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Jan 15 00:21:32.479371 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Jan 15 00:21:32.479481 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Jan 15 00:21:32.479574 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Jan 15 00:21:32.479665 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Jan 15 00:21:32.479758 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Jan 15 00:21:32.479851 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Jan 15 00:21:32.479940 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Jan 15 00:21:32.480026 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Jan 15 00:21:32.480112 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Jan 15 00:21:32.480232 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Jan 15 00:21:32.480331 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Jan 15 00:21:32.480417 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Jan 15 00:21:32.480503 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Jan 15 00:21:32.480590 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Jan 15 00:21:32.480693 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned Jan 15 00:21:32.480787 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Jan 15 00:21:32.480883 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Jan 15 00:21:32.480971 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Jan 15 00:21:32.481056 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Jan 15 00:21:32.481153 
kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Jan 15 00:21:32.481261 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Jan 15 00:21:32.481365 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Jan 15 00:21:32.481454 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Jan 15 00:21:32.481540 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 15 00:21:32.481625 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Jan 15 00:21:32.481711 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Jan 15 00:21:32.481796 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 15 00:21:32.481883 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Jan 15 00:21:32.481969 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Jan 15 00:21:32.482055 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 15 00:21:32.482142 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Jan 15 00:21:32.482244 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Jan 15 00:21:32.482333 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Jan 15 00:21:32.482432 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Jan 15 00:21:32.482524 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Jan 15 00:21:32.482620 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Jan 15 00:21:32.482711 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Jan 15 00:21:32.482843 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Jan 15 00:21:32.483062 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Jan 15 00:21:32.483194 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Jan 15 00:21:32.483361 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Jan 15 00:21:32.483482 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Jan 15 00:21:32.483595 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Jan 15 00:21:32.483687 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Jan 15 00:21:32.483781 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Jan 15 00:21:32.483875 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Jan 15 00:21:32.483976 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Jan 15 00:21:32.484080 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 15 00:21:32.484196 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Jan 15 00:21:32.484303 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Jan 15 00:21:32.484398 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 15 00:21:32.484508 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Jan 15 00:21:32.484597 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Jan 15 00:21:32.484695 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 15 00:21:32.484788 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Jan 15 00:21:32.484874 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Jan 15 00:21:32.484967 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Jan 15 00:21:32.485056 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Jan 15 00:21:32.485148 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Jan 15 00:21:32.485255 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Jan 15 00:21:32.485346 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Jan 15 00:21:32.485437 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Jan 15 00:21:32.485524 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Jan 15 00:21:32.485618 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Jan 15 00:21:32.485716 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Jan 15 00:21:32.485805 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Jan 15 00:21:32.485893 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Jan 15 00:21:32.485984 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Jan 15 00:21:32.486074 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Jan 15 00:21:32.486160 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Jan 15 00:21:32.486284 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Jan 15 00:21:32.486379 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Jan 15 00:21:32.486479 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Jan 15 00:21:32.486576 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Jan 15 00:21:32.486665 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Jan 15 00:21:32.486753 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 15 00:21:32.486864 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Jan 15 00:21:32.486954 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Jan 15 00:21:32.487048 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Jan 15 00:21:32.487140 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 15 00:21:32.487305 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Jan 15 00:21:32.487408 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff] Jan 15 00:21:32.487500 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Jan 15 00:21:32.487589 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 15 00:21:32.487680 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Jan 15 00:21:32.487771 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Jan 15 00:21:32.487865 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Jan 15 00:21:32.487964 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Jan 15 00:21:32.488058 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Jan 15 00:21:32.488146 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Jan 15 00:21:32.488251 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Jan 15 00:21:32.488342 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Jan 15 00:21:32.488433 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Jan 15 00:21:32.488535 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Jan 15 00:21:32.488632 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Jan 15 00:21:32.488723 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Jan 15 00:21:32.488817 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Jan 15 00:21:32.488919 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Jan 15 00:21:32.489034 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Jan 15 
00:21:32.489125 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Jan 15 00:21:32.489233 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Jan 15 00:21:32.489322 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Jan 15 00:21:32.489408 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Jan 15 00:21:32.489507 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] Jan 15 00:21:32.489603 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Jan 15 00:21:32.489692 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Jan 15 00:21:32.489780 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Jan 15 00:21:32.489878 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 15 00:21:32.489969 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Jan 15 00:21:32.490065 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Jan 15 00:21:32.490157 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Jan 15 00:21:32.490262 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 15 00:21:32.490358 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Jan 15 00:21:32.490460 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Jan 15 00:21:32.490561 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Jan 15 00:21:32.490660 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 15 00:21:32.490754 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Jan 15 00:21:32.490868 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Jan 15 00:21:32.490957 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Jan 15 00:21:32.491043 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Jan 15 00:21:32.491148 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jan 15 00:21:32.491257 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Jan 15 00:21:32.491352 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jan 15 00:21:32.491446 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Jan 15 00:21:32.491535 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Jan 15 00:21:32.491635 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Jan 15 00:21:32.491723 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Jan 15 00:21:32.491820 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Jan 15 00:21:32.491910 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Jan 15 00:21:32.492000 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Jan 15 00:21:32.492090 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Jan 15 00:21:32.492202 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Jan 15 00:21:32.492290 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Jan 15 00:21:32.492388 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Jan 15 00:21:32.492473 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 15 00:21:32.492572 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jan 15 00:21:32.492666 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 15 00:21:32.492765 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jan 15 
00:21:32.492860 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 15 00:21:32.492957 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jan 15 00:21:32.493042 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jan 15 00:21:32.493132 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Jan 15 00:21:32.493227 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Jan 15 00:21:32.493320 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Jan 15 00:21:32.493404 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Jan 15 00:21:32.493495 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Jan 15 00:21:32.493579 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Jan 15 00:21:32.493671 kernel: pci_bus 0000:0d: resource 1 [mem 0x11800000-0x119fffff] Jan 15 00:21:32.493761 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Jan 15 00:21:32.493854 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Jan 15 00:21:32.493948 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 15 00:21:32.494045 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Jan 15 00:21:32.494138 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 15 00:21:32.494261 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Jan 15 00:21:32.494363 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 15 00:21:32.494454 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Jan 15 00:21:32.494536 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Jan 15 00:21:32.494638 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Jan 15 00:21:32.494721 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Jan 15 00:21:32.494825 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Jan 15 00:21:32.494913 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Jan 15 00:21:32.494998 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Jan 15 00:21:32.495086 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Jan 15 00:21:32.495168 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Jan 15 00:21:32.495274 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Jan 15 00:21:32.495365 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Jan 15 00:21:32.495447 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Jan 15 00:21:32.495531 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Jan 15 00:21:32.495623 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Jan 15 00:21:32.495706 kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Jan 15 00:21:32.495787 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 15 00:21:32.495878 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Jan 15 00:21:32.495964 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Jan 15 00:21:32.496061 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 15 00:21:32.496155 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Jan 15 00:21:32.496253 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Jan 15 00:21:32.496339 kernel: pci_bus 0000:18: resource 2 [mem 
0x8002e00000-0x8002ffffff 64bit pref] Jan 15 00:21:32.496429 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Jan 15 00:21:32.496512 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Jan 15 00:21:32.496602 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Jan 15 00:21:32.496695 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Jan 15 00:21:32.496777 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Jan 15 00:21:32.496860 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Jan 15 00:21:32.496957 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 15 00:21:32.497040 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Jan 15 00:21:32.497120 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Jan 15 00:21:32.497227 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Jan 15 00:21:32.497316 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Jan 15 00:21:32.497407 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Jan 15 00:21:32.497502 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Jan 15 00:21:32.497583 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Jan 15 00:21:32.497672 kernel: pci_bus 0000:1d: resource 2 [mem 0x8003800000-0x80039fffff 64bit pref] Jan 15 00:21:32.497763 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Jan 15 00:21:32.497858 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Jan 15 00:21:32.497940 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 15 00:21:32.498041 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Jan 15 00:21:32.498125 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Jan 15 00:21:32.498232 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 15 00:21:32.498327 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Jan 15 00:21:32.498411 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Jan 15 00:21:32.498493 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 15 00:21:32.498601 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Jan 15 00:21:32.498685 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Jan 15 00:21:32.498774 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Jan 15 00:21:32.498786 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jan 15 00:21:32.498805 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jan 15 00:21:32.498814 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jan 15 00:21:32.498825 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jan 15 00:21:32.498834 kernel: iommu: Default domain type: Translated Jan 15 00:21:32.498842 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jan 15 00:21:32.498850 kernel: efivars: Registered efivars operations Jan 15 00:21:32.498858 kernel: vgaarb: loaded Jan 15 00:21:32.498866 kernel: clocksource: Switched to clocksource arch_sys_counter Jan 15 00:21:32.498874 kernel: VFS: Disk quotas dquot_6.6.0 Jan 15 00:21:32.498884 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 15 00:21:32.498892 kernel: pnp: PnP ACPI init Jan 15 00:21:32.498996 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jan 15 00:21:32.499008 kernel: pnp: PnP ACPI: found 1 devices Jan 15 00:21:32.499016 kernel: NET: Registered 
PF_INET protocol family Jan 15 00:21:32.499024 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 15 00:21:32.499033 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Jan 15 00:21:32.499043 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 15 00:21:32.499052 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 15 00:21:32.499060 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Jan 15 00:21:32.499068 kernel: TCP: Hash tables configured (established 131072 bind 65536) Jan 15 00:21:32.499077 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 15 00:21:32.499085 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 15 00:21:32.499093 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 15 00:21:32.499220 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jan 15 00:21:32.499234 kernel: PCI: CLS 0 bytes, default 64 Jan 15 00:21:32.499242 kernel: kvm [1]: HYP mode not available Jan 15 00:21:32.499250 kernel: Initialise system trusted keyrings Jan 15 00:21:32.499258 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Jan 15 00:21:32.499267 kernel: Key type asymmetric registered Jan 15 00:21:32.499275 kernel: Asymmetric key parser 'x509' registered Jan 15 00:21:32.499286 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jan 15 00:21:32.499294 kernel: io scheduler mq-deadline registered Jan 15 00:21:32.499302 kernel: io scheduler kyber registered Jan 15 00:21:32.499310 kernel: io scheduler bfq registered Jan 15 00:21:32.499319 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jan 15 00:21:32.499413 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Jan 15 00:21:32.499513 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Jan 15 00:21:32.499604 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.499693 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Jan 15 00:21:32.499778 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Jan 15 00:21:32.499863 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.499952 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Jan 15 00:21:32.500038 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Jan 15 00:21:32.500126 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.500230 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Jan 15 00:21:32.500327 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Jan 15 00:21:32.500422 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.500523 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Jan 15 00:21:32.500619 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Jan 15 00:21:32.500710 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.500799 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Jan 15 00:21:32.500888 kernel: pcieport 0000:00:01.5: AER: 
enabled with IRQ 55 Jan 15 00:21:32.500978 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.501066 kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Jan 15 00:21:32.501152 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Jan 15 00:21:32.501259 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.501355 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Jan 15 00:21:32.501443 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Jan 15 00:21:32.501531 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.501547 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jan 15 00:21:32.501648 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Jan 15 00:21:32.501750 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Jan 15 00:21:32.501843 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.501941 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Jan 15 00:21:32.502038 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Jan 15 00:21:32.502125 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.502233 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Jan 15 00:21:32.502323 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Jan 15 00:21:32.502412 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.502502 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Jan 15 00:21:32.502604 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Jan 15 00:21:32.502695 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.502801 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Jan 15 00:21:32.502902 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Jan 15 00:21:32.502994 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.503094 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Jan 15 00:21:32.503197 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Jan 15 00:21:32.503291 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.503382 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Jan 15 00:21:32.503468 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Jan 15 00:21:32.503552 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.503656 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Jan 15 00:21:32.503746 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Jan 15 00:21:32.503842 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- 
LLActRep+ Jan 15 00:21:32.503854 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jan 15 00:21:32.503950 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Jan 15 00:21:32.504041 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Jan 15 00:21:32.504130 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.504241 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Jan 15 00:21:32.504330 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Jan 15 00:21:32.504416 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.504504 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Jan 15 00:21:32.504590 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Jan 15 00:21:32.504676 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.504768 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Jan 15 00:21:32.504854 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Jan 15 00:21:32.504938 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.505026 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Jan 15 00:21:32.505111 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Jan 15 00:21:32.505215 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.505311 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Jan 15 00:21:32.505397 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Jan 15 00:21:32.505482 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.505577 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Jan 15 00:21:32.505670 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Jan 15 00:21:32.505756 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.505854 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Jan 15 00:21:32.505943 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Jan 15 00:21:32.506029 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.506041 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jan 15 00:21:32.506127 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Jan 15 00:21:32.506234 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Jan 15 00:21:32.506323 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.506425 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Jan 15 00:21:32.506512 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Jan 15 00:21:32.506609 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.506699 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Jan 15 
00:21:32.506807 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Jan 15 00:21:32.506907 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.507001 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Jan 15 00:21:32.507089 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Jan 15 00:21:32.507194 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.507290 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Jan 15 00:21:32.507379 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Jan 15 00:21:32.507472 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.507574 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Jan 15 00:21:32.507664 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Jan 15 00:21:32.507751 kernel: pcieport 0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.507840 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Jan 15 00:21:32.507928 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Jan 15 00:21:32.508013 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.508103 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Jan 15 00:21:32.508205 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Jan 15 00:21:32.508308 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.508404 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Jan 15 00:21:32.508509 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Jan 15 00:21:32.508598 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 15 00:21:32.508610 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jan 15 00:21:32.508621 kernel: ACPI: button: Power Button [PWRB] Jan 15 00:21:32.508712 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Jan 15 00:21:32.508805 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jan 15 00:21:32.508816 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 15 00:21:32.508824 kernel: thunder_xcv, ver 1.0 Jan 15 00:21:32.508832 kernel: thunder_bgx, ver 1.0 Jan 15 00:21:32.508840 kernel: nicpf, ver 1.0 Jan 15 00:21:32.508850 kernel: nicvf, ver 1.0 Jan 15 00:21:32.508947 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jan 15 00:21:32.509037 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-01-15T00:21:31 UTC (1768436491) Jan 15 00:21:32.509049 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 15 00:21:32.509058 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jan 15 00:21:32.509066 kernel: watchdog: NMI not fully supported Jan 15 00:21:32.509076 kernel: watchdog: Hard watchdog permanently disabled Jan 15 00:21:32.509084 kernel: NET: Registered PF_INET6 protocol family Jan 15 00:21:32.509092 kernel: Segment Routing with IPv6 Jan 15 00:21:32.509100 kernel: In-situ OAM (IOAM) with 
IPv6 Jan 15 00:21:32.509109 kernel: NET: Registered PF_PACKET protocol family Jan 15 00:21:32.509117 kernel: Key type dns_resolver registered Jan 15 00:21:32.509125 kernel: registered taskstats version 1 Jan 15 00:21:32.509133 kernel: Loading compiled-in X.509 certificates Jan 15 00:21:32.509144 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.65-flatcar: a690a20944211e11dad41e677dd7158a4ddc3c87' Jan 15 00:21:32.509152 kernel: Demotion targets for Node 0: null Jan 15 00:21:32.509160 kernel: Key type .fscrypt registered Jan 15 00:21:32.509168 kernel: Key type fscrypt-provisioning registered Jan 15 00:21:32.509188 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 15 00:21:32.509197 kernel: ima: Allocated hash algorithm: sha1 Jan 15 00:21:32.509205 kernel: ima: No architecture policies found Jan 15 00:21:32.509215 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jan 15 00:21:32.509223 kernel: clk: Disabling unused clocks Jan 15 00:21:32.509231 kernel: PM: genpd: Disabling unused power domains Jan 15 00:21:32.509240 kernel: Freeing unused kernel memory: 12416K Jan 15 00:21:32.509248 kernel: Run /init as init process Jan 15 00:21:32.509256 kernel: with arguments: Jan 15 00:21:32.509264 kernel: /init Jan 15 00:21:32.509274 kernel: with environment: Jan 15 00:21:32.509282 kernel: HOME=/ Jan 15 00:21:32.509290 kernel: TERM=linux Jan 15 00:21:32.509297 kernel: ACPI: bus type USB registered Jan 15 00:21:32.509306 kernel: usbcore: registered new interface driver usbfs Jan 15 00:21:32.509314 kernel: usbcore: registered new interface driver hub Jan 15 00:21:32.509322 kernel: usbcore: registered new device driver usb Jan 15 00:21:32.509425 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 15 00:21:32.509525 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 15 00:21:32.509635 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 15 00:21:32.509728 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 15 00:21:32.509816 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 15 00:21:32.509904 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 15 00:21:32.510023 kernel: hub 1-0:1.0: USB hub found Jan 15 00:21:32.510161 kernel: hub 1-0:1.0: 4 ports detected Jan 15 00:21:32.510297 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jan 15 00:21:32.510413 kernel: hub 2-0:1.0: USB hub found Jan 15 00:21:32.510507 kernel: hub 2-0:1.0: 4 ports detected Jan 15 00:21:32.510625 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Jan 15 00:21:32.510727 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Jan 15 00:21:32.510738 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 15 00:21:32.510748 kernel: GPT:25804799 != 104857599 Jan 15 00:21:32.510756 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 15 00:21:32.510764 kernel: GPT:25804799 != 104857599 Jan 15 00:21:32.510773 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 15 00:21:32.510783 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 15 00:21:32.510803 kernel: SCSI subsystem initialized Jan 15 00:21:32.510814 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 15 00:21:32.510822 kernel: device-mapper: uevent: version 1.0.3 Jan 15 00:21:32.510831 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 15 00:21:32.510840 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jan 15 00:21:32.510850 kernel: raid6: neonx8 gen() 15720 MB/s Jan 15 00:21:32.510860 kernel: raid6: neonx4 gen() 15448 MB/s Jan 15 00:21:32.510868 kernel: raid6: neonx2 gen() 12331 MB/s Jan 15 00:21:32.510877 kernel: raid6: neonx1 gen() 10237 MB/s Jan 15 00:21:32.510885 kernel: raid6: int64x8 gen() 6752 MB/s Jan 15 00:21:32.510894 kernel: raid6: int64x4 gen() 7283 MB/s Jan 15 00:21:32.510902 kernel: raid6: int64x2 gen() 6057 MB/s Jan 15 00:21:32.510910 kernel: raid6: int64x1 gen() 5008 MB/s Jan 15 00:21:32.510920 kernel: raid6: using algorithm neonx8 gen() 15720 MB/s Jan 15 00:21:32.510929 kernel: raid6: .... xor() 11919 MB/s, rmw enabled Jan 15 00:21:32.510937 kernel: raid6: using neon recovery algorithm Jan 15 00:21:32.510946 kernel: xor: measuring software checksum speed Jan 15 00:21:32.510957 kernel: 8regs : 21596 MB/sec Jan 15 00:21:32.510965 kernel: 32regs : 21710 MB/sec Jan 15 00:21:32.510975 kernel: arm64_neon : 28070 MB/sec Jan 15 00:21:32.510984 kernel: xor: using function: arm64_neon (28070 MB/sec) Jan 15 00:21:32.511116 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 15 00:21:32.511131 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 15 00:21:32.511140 kernel: BTRFS: device fsid 78d59ed4-d19c-4fcc-8998-5f0c19b42daf devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (276) Jan 15 00:21:32.511149 kernel: BTRFS info (device dm-0): first mount of filesystem 78d59ed4-d19c-4fcc-8998-5f0c19b42daf Jan 15 00:21:32.511158 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jan 15 00:21:32.511169 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 15 00:21:32.511196 kernel: BTRFS info (device dm-0): enabling free space tree Jan 15 00:21:32.511206 kernel: loop: module loaded Jan 15 00:21:32.511214 kernel: loop0: detected capacity change from 0 to 91488 Jan 15 00:21:32.511223 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 15 00:21:32.511344 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jan 15 00:21:32.511361 systemd[1]: Successfully made /usr/ read-only. Jan 15 00:21:32.511372 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 15 00:21:32.511382 systemd[1]: Detected virtualization kvm. Jan 15 00:21:32.511390 systemd[1]: Detected architecture arm64. Jan 15 00:21:32.511399 systemd[1]: Running in initrd. Jan 15 00:21:32.511408 systemd[1]: No hostname configured, using default hostname. Jan 15 00:21:32.511418 systemd[1]: Hostname set to . Jan 15 00:21:32.511428 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 15 00:21:32.511437 systemd[1]: Queued start job for default target initrd.target. Jan 15 00:21:32.511445 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 15 00:21:32.511454 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Jan 15 00:21:32.511463 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 15 00:21:32.511475 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 15 00:21:32.511484 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 15 00:21:32.511493 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 15 00:21:32.511502 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 15 00:21:32.511511 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 15 00:21:32.511520 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 15 00:21:32.511531 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 15 00:21:32.511540 systemd[1]: Reached target paths.target - Path Units. Jan 15 00:21:32.511549 systemd[1]: Reached target slices.target - Slice Units. Jan 15 00:21:32.511558 systemd[1]: Reached target swap.target - Swaps. Jan 15 00:21:32.511567 systemd[1]: Reached target timers.target - Timer Units. Jan 15 00:21:32.511576 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 15 00:21:32.511585 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 15 00:21:32.511595 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 15 00:21:32.511604 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 15 00:21:32.511613 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 15 00:21:32.511622 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 15 00:21:32.511631 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 15 00:21:32.511640 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 15 00:21:32.511650 systemd[1]: Reached target sockets.target - Socket Units. Jan 15 00:21:32.511660 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 15 00:21:32.511669 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 15 00:21:32.511678 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 15 00:21:32.511687 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 15 00:21:32.511696 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 15 00:21:32.511705 systemd[1]: Starting systemd-fsck-usr.service... Jan 15 00:21:32.511716 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 15 00:21:32.511725 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 15 00:21:32.511734 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 00:21:32.511743 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 15 00:21:32.511776 systemd-journald[417]: Collecting audit messages is enabled. Jan 15 00:21:32.511798 kernel: audit: type=1130 audit(1768436492.430:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:21:32.511810 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 15 00:21:32.511819 kernel: audit: type=1130 audit(1768436492.434:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.511828 systemd[1]: Finished systemd-fsck-usr.service. Jan 15 00:21:32.511838 kernel: audit: type=1130 audit(1768436492.438:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.511847 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 15 00:21:32.511856 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 15 00:21:32.511864 kernel: Bridge firewalling registered Jan 15 00:21:32.511875 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 15 00:21:32.511884 kernel: audit: type=1130 audit(1768436492.459:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.511893 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 00:21:32.511902 kernel: audit: type=1130 audit(1768436492.464:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.511911 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 15 00:21:32.511921 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 15 00:21:32.511932 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 15 00:21:32.511942 kernel: audit: type=1130 audit(1768436492.479:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.511951 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 15 00:21:32.511962 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 15 00:21:32.511971 kernel: audit: type=1130 audit(1768436492.496:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.511980 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 15 00:21:32.511991 kernel: audit: type=1130 audit(1768436492.501:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.512001 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 15 00:21:32.512010 kernel: audit: type=1130 audit(1768436492.505:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:21:32.512019 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 15 00:21:32.512028 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 15 00:21:32.512038 systemd-journald[417]: Journal started Jan 15 00:21:32.512059 systemd-journald[417]: Runtime Journal (/run/log/journal/793873be56a44770b8aaf5b0fa3487c7) is 8M, max 319.5M, 311.5M free. Jan 15 00:21:32.430000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.434000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.438000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.459000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.464000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.479000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.496000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.501000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.505000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.510000 audit: BPF prog-id=6 op=LOAD Jan 15 00:21:32.450845 systemd-modules-load[422]: Inserted module 'br_netfilter' Jan 15 00:21:32.520269 systemd[1]: Started systemd-journald.service - Journal Service. Jan 15 00:21:32.520000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.524963 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 15 00:21:32.532096 dracut-cmdline[447]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=e4a6d042213df6c386c00b2ef561482ef59cf24ca6770345ce520c577e366e5a Jan 15 00:21:32.535732 systemd-tmpfiles[462]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. 
Jan 15 00:21:32.539543 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 15 00:21:32.542000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.560001 systemd-resolved[448]: Positive Trust Anchors: Jan 15 00:21:32.560025 systemd-resolved[448]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 15 00:21:32.560028 systemd-resolved[448]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 15 00:21:32.560060 systemd-resolved[448]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 15 00:21:32.586692 systemd-resolved[448]: Defaulting to hostname 'linux'. Jan 15 00:21:32.588000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.587725 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 15 00:21:32.588842 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 15 00:21:32.623212 kernel: Loading iSCSI transport class v2.0-870. Jan 15 00:21:32.634199 kernel: iscsi: registered transport (tcp) Jan 15 00:21:32.648291 kernel: iscsi: registered transport (qla4xxx) Jan 15 00:21:32.648327 kernel: QLogic iSCSI HBA Driver Jan 15 00:21:32.671529 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 15 00:21:32.689884 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 15 00:21:32.692000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.693661 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 15 00:21:32.742653 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 15 00:21:32.743000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.745113 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 15 00:21:32.746767 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 15 00:21:32.785906 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 15 00:21:32.786000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:21:32.787000 audit: BPF prog-id=7 op=LOAD Jan 15 00:21:32.787000 audit: BPF prog-id=8 op=LOAD Jan 15 00:21:32.788505 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 15 00:21:32.820530 systemd-udevd[695]: Using default interface naming scheme 'v257'. Jan 15 00:21:32.828665 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 15 00:21:32.829000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.832345 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 15 00:21:32.856529 dracut-pre-trigger[767]: rd.md=0: removing MD RAID activation Jan 15 00:21:32.860355 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 15 00:21:32.861000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.863946 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 15 00:21:32.863000 audit: BPF prog-id=9 op=LOAD Jan 15 00:21:32.883301 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 15 00:21:32.884000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.885705 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 15 00:21:32.911898 systemd-networkd[813]: lo: Link UP Jan 15 00:21:32.911907 systemd-networkd[813]: lo: Gained carrier Jan 15 00:21:32.913000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.912525 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 15 00:21:32.914021 systemd[1]: Reached target network.target - Network. Jan 15 00:21:32.977562 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 15 00:21:32.979000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:32.984316 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 15 00:21:33.044549 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 15 00:21:33.062182 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 15 00:21:33.069834 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 15 00:21:33.079719 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 15 00:21:33.083403 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jan 15 00:21:33.083243 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Jan 15 00:21:33.092325 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 15 00:21:33.093225 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jan 15 00:21:33.105783 disk-uuid[874]: Primary Header is updated. Jan 15 00:21:33.105783 disk-uuid[874]: Secondary Entries is updated. Jan 15 00:21:33.105783 disk-uuid[874]: Secondary Header is updated. Jan 15 00:21:33.123514 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 15 00:21:33.124840 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 00:21:33.126254 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 00:21:33.125000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:33.132240 systemd-networkd[813]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 15 00:21:33.132248 systemd-networkd[813]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 15 00:21:33.133412 systemd-networkd[813]: eth0: Link UP Jan 15 00:21:33.133588 systemd-networkd[813]: eth0: Gained carrier Jan 15 00:21:33.133600 systemd-networkd[813]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 15 00:21:33.150255 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jan 15 00:21:33.150474 kernel: usbcore: registered new interface driver usbhid Jan 15 00:21:33.150488 kernel: usbhid: USB HID core driver Jan 15 00:21:33.135795 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 00:21:33.175303 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 00:21:33.179000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:33.211905 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 15 00:21:33.213234 systemd-networkd[813]: eth0: DHCPv4 address 10.0.3.29/25, gateway 10.0.3.1 acquired from 10.0.3.1 Jan 15 00:21:33.212000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:33.213547 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 15 00:21:33.215106 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 15 00:21:33.216258 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 15 00:21:33.219225 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 15 00:21:33.246925 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 15 00:21:33.247000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:21:34.144153 disk-uuid[875]: Warning: The kernel is still using the old partition table. Jan 15 00:21:34.144153 disk-uuid[875]: The new table will be used at the next reboot or after you Jan 15 00:21:34.144153 disk-uuid[875]: run partprobe(8) or kpartx(8) Jan 15 00:21:34.144153 disk-uuid[875]: The operation has completed successfully. Jan 15 00:21:34.149509 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 15 00:21:34.150000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:34.150000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:34.149615 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 15 00:21:34.151774 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 15 00:21:34.190213 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (910) Jan 15 00:21:34.190268 kernel: BTRFS info (device vda6): first mount of filesystem 0eb28982-35f7-4b76-8133-b752f60f3941 Jan 15 00:21:34.192239 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 15 00:21:34.196782 kernel: BTRFS info (device vda6): turning on async discard Jan 15 00:21:34.196830 kernel: BTRFS info (device vda6): enabling free space tree Jan 15 00:21:34.203200 kernel: BTRFS info (device vda6): last unmount of filesystem 0eb28982-35f7-4b76-8133-b752f60f3941 Jan 15 00:21:34.203654 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 15 00:21:34.204000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:34.207680 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 15 00:21:34.359678 ignition[929]: Ignition 2.22.0 Jan 15 00:21:34.359697 ignition[929]: Stage: fetch-offline Jan 15 00:21:34.359740 ignition[929]: no configs at "/usr/lib/ignition/base.d" Jan 15 00:21:34.359749 ignition[929]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 00:21:34.359925 ignition[929]: parsed url from cmdline: "" Jan 15 00:21:34.362695 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 15 00:21:34.363000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:34.359928 ignition[929]: no config URL provided Jan 15 00:21:34.365067 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Jan 15 00:21:34.359933 ignition[929]: reading system config file "/usr/lib/ignition/user.ign" Jan 15 00:21:34.359941 ignition[929]: no config at "/usr/lib/ignition/user.ign" Jan 15 00:21:34.359946 ignition[929]: failed to fetch config: resource requires networking Jan 15 00:21:34.360217 ignition[929]: Ignition finished successfully Jan 15 00:21:34.392613 ignition[941]: Ignition 2.22.0 Jan 15 00:21:34.392636 ignition[941]: Stage: fetch Jan 15 00:21:34.392793 ignition[941]: no configs at "/usr/lib/ignition/base.d" Jan 15 00:21:34.392802 ignition[941]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 00:21:34.392879 ignition[941]: parsed url from cmdline: "" Jan 15 00:21:34.392882 ignition[941]: no config URL provided Jan 15 00:21:34.392887 ignition[941]: reading system config file "/usr/lib/ignition/user.ign" Jan 15 00:21:34.392893 ignition[941]: no config at "/usr/lib/ignition/user.ign" Jan 15 00:21:34.393165 ignition[941]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 15 00:21:34.393333 ignition[941]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 15 00:21:34.393502 ignition[941]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 15 00:21:35.052258 ignition[941]: GET result: OK Jan 15 00:21:35.052562 ignition[941]: parsing config with SHA512: fccd7a103184e9c127b51e0c25f1bfbc3adbc9f4a91a7ef821f99789d9825a0ac2367a4fd8751472574a6a989071ec7eb9fd7a2ac047a64f654a6b667c52f379 Jan 15 00:21:35.058392 unknown[941]: fetched base config from "system" Jan 15 00:21:35.058404 unknown[941]: fetched base config from "system" Jan 15 00:21:35.058743 ignition[941]: fetch: fetch complete Jan 15 00:21:35.058410 unknown[941]: fetched user config from "openstack" Jan 15 00:21:35.058747 ignition[941]: fetch: fetch passed Jan 15 00:21:35.058802 ignition[941]: Ignition finished successfully Jan 15 00:21:35.063233 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 15 00:21:35.068515 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 15 00:21:35.068544 kernel: audit: type=1130 audit(1768436495.064:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:35.064000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:35.065776 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 15 00:21:35.096237 ignition[949]: Ignition 2.22.0 Jan 15 00:21:35.096252 ignition[949]: Stage: kargs Jan 15 00:21:35.096390 ignition[949]: no configs at "/usr/lib/ignition/base.d" Jan 15 00:21:35.096398 ignition[949]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 00:21:35.097158 ignition[949]: kargs: kargs passed Jan 15 00:21:35.097239 ignition[949]: Ignition finished successfully Jan 15 00:21:35.104267 kernel: audit: type=1130 audit(1768436495.100:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:35.100000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:35.100012 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
Jan 15 00:21:35.102059 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 15 00:21:35.127857 ignition[957]: Ignition 2.22.0 Jan 15 00:21:35.127879 ignition[957]: Stage: disks Jan 15 00:21:35.128033 ignition[957]: no configs at "/usr/lib/ignition/base.d" Jan 15 00:21:35.128041 ignition[957]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 00:21:35.128809 ignition[957]: disks: disks passed Jan 15 00:21:35.132000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:35.131291 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 15 00:21:35.137719 kernel: audit: type=1130 audit(1768436495.132:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:35.128860 ignition[957]: Ignition finished successfully Jan 15 00:21:35.133317 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 15 00:21:35.136928 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 15 00:21:35.138759 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 15 00:21:35.140291 systemd[1]: Reached target sysinit.target - System Initialization. Jan 15 00:21:35.142068 systemd[1]: Reached target basic.target - Basic System. Jan 15 00:21:35.144971 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 15 00:21:35.185537 systemd-networkd[813]: eth0: Gained IPv6LL Jan 15 00:21:35.196502 systemd-fsck[967]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 15 00:21:35.199431 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 15 00:21:35.200000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:35.201843 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 15 00:21:35.206867 kernel: audit: type=1130 audit(1768436495.200:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:35.325226 kernel: EXT4-fs (vda9): mounted filesystem 05dab3f9-40c2-46d9-a2a2-3da8ed7c4451 r/w with ordered data mode. Quota mode: none. Jan 15 00:21:35.325317 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 15 00:21:35.326547 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 15 00:21:35.330213 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 15 00:21:35.332817 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 15 00:21:35.333839 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 15 00:21:35.342311 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 15 00:21:35.343491 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 15 00:21:35.343530 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. 
Jan 15 00:21:35.345569 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 15 00:21:35.347974 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 15 00:21:35.361223 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (976) Jan 15 00:21:35.364069 kernel: BTRFS info (device vda6): first mount of filesystem 0eb28982-35f7-4b76-8133-b752f60f3941 Jan 15 00:21:35.364103 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 15 00:21:35.370381 kernel: BTRFS info (device vda6): turning on async discard Jan 15 00:21:35.370431 kernel: BTRFS info (device vda6): enabling free space tree Jan 15 00:21:35.372553 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 15 00:21:35.408212 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 00:21:35.419577 initrd-setup-root[1004]: cut: /sysroot/etc/passwd: No such file or directory Jan 15 00:21:35.426501 initrd-setup-root[1011]: cut: /sysroot/etc/group: No such file or directory Jan 15 00:21:35.430738 initrd-setup-root[1018]: cut: /sysroot/etc/shadow: No such file or directory Jan 15 00:21:35.436619 initrd-setup-root[1025]: cut: /sysroot/etc/gshadow: No such file or directory Jan 15 00:21:35.544668 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 15 00:21:35.545000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:35.547065 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 15 00:21:35.551474 kernel: audit: type=1130 audit(1768436495.545:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:35.551653 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 15 00:21:35.565521 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 15 00:21:35.567386 kernel: BTRFS info (device vda6): last unmount of filesystem 0eb28982-35f7-4b76-8133-b752f60f3941 Jan 15 00:21:35.591808 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 15 00:21:35.592000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:35.596186 kernel: audit: type=1130 audit(1768436495.592:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:35.597028 ignition[1093]: INFO : Ignition 2.22.0 Jan 15 00:21:35.597028 ignition[1093]: INFO : Stage: mount Jan 15 00:21:35.599655 ignition[1093]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 15 00:21:35.599655 ignition[1093]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 00:21:35.599655 ignition[1093]: INFO : mount: mount passed Jan 15 00:21:35.599655 ignition[1093]: INFO : Ignition finished successfully Jan 15 00:21:35.606390 kernel: audit: type=1130 audit(1768436495.601:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:21:35.601000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:35.600574 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 15 00:21:36.467250 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 00:21:38.477265 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 00:21:42.487255 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 00:21:42.494937 coreos-metadata[978]: Jan 15 00:21:42.494 WARN failed to locate config-drive, using the metadata service API instead Jan 15 00:21:42.513389 coreos-metadata[978]: Jan 15 00:21:42.513 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 15 00:21:44.894680 coreos-metadata[978]: Jan 15 00:21:44.894 INFO Fetch successful Jan 15 00:21:44.896060 coreos-metadata[978]: Jan 15 00:21:44.895 INFO wrote hostname ci-4515-1-0-n-1ddc109f0f to /sysroot/etc/hostname Jan 15 00:21:44.898102 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 15 00:21:44.905025 kernel: audit: type=1130 audit(1768436504.899:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:44.905051 kernel: audit: type=1131 audit(1768436504.899:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:44.899000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:44.899000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:44.898239 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 15 00:21:44.900391 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 15 00:21:44.921537 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 15 00:21:44.947195 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1111) Jan 15 00:21:44.951195 kernel: BTRFS info (device vda6): first mount of filesystem 0eb28982-35f7-4b76-8133-b752f60f3941 Jan 15 00:21:44.951229 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 15 00:21:44.955604 kernel: BTRFS info (device vda6): turning on async discard Jan 15 00:21:44.955624 kernel: BTRFS info (device vda6): enabling free space tree Jan 15 00:21:44.957037 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jan 15 00:21:44.989067 ignition[1129]: INFO : Ignition 2.22.0 Jan 15 00:21:44.989067 ignition[1129]: INFO : Stage: files Jan 15 00:21:44.990702 ignition[1129]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 15 00:21:44.990702 ignition[1129]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 00:21:44.990702 ignition[1129]: DEBUG : files: compiled without relabeling support, skipping Jan 15 00:21:44.993851 ignition[1129]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 15 00:21:44.993851 ignition[1129]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 15 00:21:44.999729 ignition[1129]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 15 00:21:45.001051 ignition[1129]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 15 00:21:45.001051 ignition[1129]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 15 00:21:45.000358 unknown[1129]: wrote ssh authorized keys file for user: core Jan 15 00:21:45.004695 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Jan 15 00:21:45.004695 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1 Jan 15 00:21:45.061125 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 15 00:21:45.177619 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Jan 15 00:21:45.177619 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 15 00:21:45.181507 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 15 00:21:45.181507 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 15 00:21:45.181507 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 15 00:21:45.181507 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 15 00:21:45.181507 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 15 00:21:45.181507 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 15 00:21:45.181507 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 15 00:21:45.181507 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 15 00:21:45.181507 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 15 00:21:45.181507 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jan 15 00:21:45.198096 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: 
op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jan 15 00:21:45.198096 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jan 15 00:21:45.198096 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1 Jan 15 00:21:45.465660 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 15 00:21:46.043284 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jan 15 00:21:46.043284 ignition[1129]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 15 00:21:46.047641 ignition[1129]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 15 00:21:46.051532 ignition[1129]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 15 00:21:46.051532 ignition[1129]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 15 00:21:46.051532 ignition[1129]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 15 00:21:46.061913 kernel: audit: type=1130 audit(1768436506.057:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.057000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.061981 ignition[1129]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 15 00:21:46.061981 ignition[1129]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 15 00:21:46.061981 ignition[1129]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 15 00:21:46.061981 ignition[1129]: INFO : files: files passed Jan 15 00:21:46.061981 ignition[1129]: INFO : Ignition finished successfully Jan 15 00:21:46.055975 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 15 00:21:46.058639 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 15 00:21:46.072842 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 15 00:21:46.075872 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 15 00:21:46.075992 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 15 00:21:46.084735 kernel: audit: type=1130 audit(1768436506.078:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.084763 kernel: audit: type=1131 audit(1768436506.078:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:21:46.078000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.078000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.087314 initrd-setup-root-after-ignition[1162]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 15 00:21:46.087314 initrd-setup-root-after-ignition[1162]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 15 00:21:46.090984 initrd-setup-root-after-ignition[1166]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 15 00:21:46.091846 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 15 00:21:46.095000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.095693 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 15 00:21:46.102057 kernel: audit: type=1130 audit(1768436506.095:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.101799 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 15 00:21:46.143958 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 15 00:21:46.144078 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 15 00:21:46.151625 kernel: audit: type=1130 audit(1768436506.145:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.151651 kernel: audit: type=1131 audit(1768436506.145:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.145000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.145000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.146312 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 15 00:21:46.152522 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 15 00:21:46.154333 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 15 00:21:46.155340 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 15 00:21:46.171025 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 15 00:21:46.175843 kernel: audit: type=1130 audit(1768436506.172:49): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 15 00:21:46.172000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.173618 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 15 00:21:46.202494 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 15 00:21:46.202709 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 15 00:21:46.204961 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 15 00:21:46.206889 systemd[1]: Stopped target timers.target - Timer Units. Jan 15 00:21:46.208555 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 15 00:21:46.213209 kernel: audit: type=1131 audit(1768436506.210:50): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.210000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.208692 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 15 00:21:46.213348 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 15 00:21:46.215452 systemd[1]: Stopped target basic.target - Basic System. Jan 15 00:21:46.217007 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 15 00:21:46.218617 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 15 00:21:46.220368 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 15 00:21:46.222224 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 15 00:21:46.224166 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 15 00:21:46.225995 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 15 00:21:46.227864 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 15 00:21:46.229690 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 15 00:21:46.231258 systemd[1]: Stopped target swap.target - Swaps. Jan 15 00:21:46.232728 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 15 00:21:46.234000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.232861 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 15 00:21:46.234999 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 15 00:21:46.236906 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 15 00:21:46.238691 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 15 00:21:46.242335 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 15 00:21:46.243787 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 15 00:21:46.245000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:21:46.243908 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 15 00:21:46.246566 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 15 00:21:46.248000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.246687 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 15 00:21:46.249000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.248509 systemd[1]: ignition-files.service: Deactivated successfully. Jan 15 00:21:46.248616 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 15 00:21:46.251087 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 15 00:21:46.255000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.252696 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 15 00:21:46.256000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.253485 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 15 00:21:46.258000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.253612 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 15 00:21:46.255362 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 15 00:21:46.255487 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 15 00:21:46.257090 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 15 00:21:46.257211 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 15 00:21:46.262352 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 15 00:21:46.278497 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 15 00:21:46.279000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.279000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:21:46.295453 ignition[1186]: INFO : Ignition 2.22.0 Jan 15 00:21:46.295453 ignition[1186]: INFO : Stage: umount Jan 15 00:21:46.297268 ignition[1186]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 15 00:21:46.297268 ignition[1186]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 00:21:46.297268 ignition[1186]: INFO : umount: umount passed Jan 15 00:21:46.297268 ignition[1186]: INFO : Ignition finished successfully Jan 15 00:21:46.302000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.295850 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 15 00:21:46.304000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.298522 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 15 00:21:46.306000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.298623 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 15 00:21:46.311000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.302816 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 15 00:21:46.302919 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 15 00:21:46.315000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.304704 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 15 00:21:46.304752 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 15 00:21:46.306616 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 15 00:21:46.306668 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 15 00:21:46.312093 systemd[1]: Stopped target network.target - Network. Jan 15 00:21:46.313653 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 15 00:21:46.313724 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 15 00:21:46.315375 systemd[1]: Stopped target paths.target - Path Units. Jan 15 00:21:46.316818 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 15 00:21:46.321278 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 15 00:21:46.334000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.322804 systemd[1]: Stopped target slices.target - Slice Units. Jan 15 00:21:46.336000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.324357 systemd[1]: Stopped target sockets.target - Socket Units. 
Jan 15 00:21:46.325815 systemd[1]: iscsid.socket: Deactivated successfully. Jan 15 00:21:46.325857 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 15 00:21:46.340000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.327786 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 15 00:21:46.342000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.327825 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 15 00:21:46.330382 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 15 00:21:46.345000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.330407 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 15 00:21:46.332152 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 15 00:21:46.332225 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 15 00:21:46.334291 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 15 00:21:46.349000 audit: BPF prog-id=6 op=UNLOAD Jan 15 00:21:46.334336 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 15 00:21:46.336736 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 15 00:21:46.338145 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 15 00:21:46.352000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.340007 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 15 00:21:46.340086 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 15 00:21:46.341553 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 15 00:21:46.341645 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 15 00:21:46.344197 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 15 00:21:46.344298 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 15 00:21:46.351455 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 15 00:21:46.362000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.351579 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 15 00:21:46.364000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.355956 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 15 00:21:46.365000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.357239 systemd[1]: systemd-networkd.socket: Deactivated successfully. 
Jan 15 00:21:46.357283 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 15 00:21:46.369000 audit: BPF prog-id=9 op=UNLOAD Jan 15 00:21:46.359856 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 15 00:21:46.360760 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 15 00:21:46.360820 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 15 00:21:46.362733 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 15 00:21:46.362798 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 15 00:21:46.375000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.364346 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 15 00:21:46.364390 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 15 00:21:46.366183 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 15 00:21:46.380000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.374028 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 15 00:21:46.382000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.374169 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 15 00:21:46.384000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.376276 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 15 00:21:46.376313 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 15 00:21:46.377790 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 15 00:21:46.390000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.377823 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 15 00:21:46.391000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.379496 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 15 00:21:46.393000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.379549 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 15 00:21:46.395000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.382215 systemd[1]: dracut-cmdline.service: Deactivated successfully. 
Jan 15 00:21:46.397000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.382264 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 15 00:21:46.383894 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 15 00:21:46.383947 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 15 00:21:46.387010 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 15 00:21:46.388195 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 15 00:21:46.388272 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 15 00:21:46.390214 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 15 00:21:46.390270 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 15 00:21:46.392139 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jan 15 00:21:46.392203 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 15 00:21:46.394081 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 15 00:21:46.394125 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 15 00:21:46.395972 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 15 00:21:46.396022 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 00:21:46.424465 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 15 00:21:46.424602 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 15 00:21:46.426000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.426000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.426871 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 15 00:21:46.428216 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 15 00:21:46.429908 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 15 00:21:46.429000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:46.431895 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 15 00:21:46.441484 systemd[1]: Switching root. Jan 15 00:21:46.491306 systemd-journald[417]: Journal stopped Jan 15 00:21:47.406501 systemd-journald[417]: Received SIGTERM from PID 1 (systemd). 
Jan 15 00:21:47.406593 kernel: SELinux: policy capability network_peer_controls=1 Jan 15 00:21:47.406610 kernel: SELinux: policy capability open_perms=1 Jan 15 00:21:47.406624 kernel: SELinux: policy capability extended_socket_class=1 Jan 15 00:21:47.406639 kernel: SELinux: policy capability always_check_network=0 Jan 15 00:21:47.406650 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 15 00:21:47.406664 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 15 00:21:47.406677 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 15 00:21:47.406691 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 15 00:21:47.406701 kernel: SELinux: policy capability userspace_initial_context=0 Jan 15 00:21:47.406712 systemd[1]: Successfully loaded SELinux policy in 66.719ms. Jan 15 00:21:47.406731 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.939ms. Jan 15 00:21:47.406743 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 15 00:21:47.406777 systemd[1]: Detected virtualization kvm. Jan 15 00:21:47.406789 systemd[1]: Detected architecture arm64. Jan 15 00:21:47.406803 systemd[1]: Detected first boot. Jan 15 00:21:47.406814 systemd[1]: Hostname set to . Jan 15 00:21:47.406825 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 15 00:21:47.406836 zram_generator::config[1233]: No configuration found. Jan 15 00:21:47.406856 kernel: NET: Registered PF_VSOCK protocol family Jan 15 00:21:47.406866 systemd[1]: Populated /etc with preset unit settings. Jan 15 00:21:47.406877 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 15 00:21:47.406888 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 15 00:21:47.406899 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 15 00:21:47.406911 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 15 00:21:47.406922 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 15 00:21:47.406935 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 15 00:21:47.406946 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 15 00:21:47.406958 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 15 00:21:47.406969 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 15 00:21:47.406980 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 15 00:21:47.406991 systemd[1]: Created slice user.slice - User and Session Slice. Jan 15 00:21:47.407002 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 15 00:21:47.407014 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 15 00:21:47.407026 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 15 00:21:47.407036 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. 
Jan 15 00:21:47.407047 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 15 00:21:47.407061 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 15 00:21:47.407072 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Jan 15 00:21:47.407083 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 15 00:21:47.407096 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 15 00:21:47.407107 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 15 00:21:47.407118 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 15 00:21:47.407129 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 15 00:21:47.407140 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 15 00:21:47.407153 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 15 00:21:47.407164 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 15 00:21:47.407191 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 15 00:21:47.407203 systemd[1]: Reached target slices.target - Slice Units. Jan 15 00:21:47.407214 systemd[1]: Reached target swap.target - Swaps. Jan 15 00:21:47.407238 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 15 00:21:47.407251 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 15 00:21:47.407264 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 15 00:21:47.407275 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 15 00:21:47.407286 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 15 00:21:47.407297 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 15 00:21:47.407308 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 15 00:21:47.407319 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 15 00:21:47.407330 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 15 00:21:47.407342 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 15 00:21:47.407354 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 15 00:21:47.407365 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 15 00:21:47.407376 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 15 00:21:47.407387 systemd[1]: Mounting media.mount - External Media Directory... Jan 15 00:21:47.407398 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 15 00:21:47.407409 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 15 00:21:47.407421 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 15 00:21:47.407433 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 15 00:21:47.407444 systemd[1]: Reached target machines.target - Containers. Jan 15 00:21:47.407455 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
Jan 15 00:21:47.407466 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 15 00:21:47.407477 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 15 00:21:47.407489 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 15 00:21:47.407501 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 15 00:21:47.407512 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 15 00:21:47.407523 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 15 00:21:47.407538 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 15 00:21:47.407550 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 15 00:21:47.407562 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 15 00:21:47.407576 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 15 00:21:47.407588 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 15 00:21:47.407599 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 15 00:21:47.407612 systemd[1]: Stopped systemd-fsck-usr.service. Jan 15 00:21:47.407625 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 15 00:21:47.407637 kernel: fuse: init (API version 7.41) Jan 15 00:21:47.407647 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 15 00:21:47.407658 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 15 00:21:47.407669 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 15 00:21:47.407680 kernel: ACPI: bus type drm_connector registered Jan 15 00:21:47.407690 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 15 00:21:47.407703 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 15 00:21:47.407714 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 15 00:21:47.407725 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 15 00:21:47.407736 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 15 00:21:47.407746 systemd[1]: Mounted media.mount - External Media Directory. Jan 15 00:21:47.407757 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 15 00:21:47.407768 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 15 00:21:47.407781 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 15 00:21:47.407791 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 15 00:21:47.407806 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 15 00:21:47.407842 systemd-journald[1305]: Collecting audit messages is enabled. Jan 15 00:21:47.407871 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 15 00:21:47.407882 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. 
Jan 15 00:21:47.407894 systemd-journald[1305]: Journal started Jan 15 00:21:47.407914 systemd-journald[1305]: Runtime Journal (/run/log/journal/793873be56a44770b8aaf5b0fa3487c7) is 8M, max 319.5M, 311.5M free. Jan 15 00:21:47.248000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Jan 15 00:21:47.340000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.342000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.345000 audit: BPF prog-id=14 op=UNLOAD Jan 15 00:21:47.345000 audit: BPF prog-id=13 op=UNLOAD Jan 15 00:21:47.346000 audit: BPF prog-id=15 op=LOAD Jan 15 00:21:47.346000 audit: BPF prog-id=16 op=LOAD Jan 15 00:21:47.346000 audit: BPF prog-id=17 op=LOAD Jan 15 00:21:47.402000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.404000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 15 00:21:47.404000 audit[1305]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=4 a1=fffff2d2f020 a2=4000 a3=0 items=0 ppid=1 pid=1305 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:21:47.404000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 15 00:21:47.405000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.148504 systemd[1]: Queued start job for default target multi-user.target. Jan 15 00:21:47.172775 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 15 00:21:47.173221 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 15 00:21:47.410000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.410000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.411000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.412201 systemd[1]: Started systemd-journald.service - Journal Service. Jan 15 00:21:47.413135 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 15 00:21:47.414220 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. 
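The entries above show systemd-journald starting and systemd replaying its earlier audit and service records into it. As a side note, if you later want to pull exactly these PID 1 records back out of the journal programmatically, a minimal sketch using the python-systemd bindings could look like the following (the bindings are an assumption here; nothing in this boot log shows them installed):

    # Minimal sketch: iterate this boot's journal entries logged by PID 1.
    # Assumes the python-systemd bindings are available, which this log
    # does not confirm.
    from systemd import journal

    reader = journal.Reader()
    reader.this_boot()          # limit to the current boot
    reader.add_match(_PID="1")  # only messages originating from systemd (PID 1)

    for entry in reader:
        ts = entry["__REALTIME_TIMESTAMP"]   # timestamp of the record
        print(ts, entry.get("MESSAGE", ""))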
Jan 15 00:21:47.415000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.415000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.416019 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 15 00:21:47.416261 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 15 00:21:47.417000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.417000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.417515 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 15 00:21:47.417684 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 15 00:21:47.418000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.418000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.419218 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 15 00:21:47.419382 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 15 00:21:47.420000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.420000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.420736 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 15 00:21:47.420905 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 15 00:21:47.421000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.421000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.422313 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 15 00:21:47.423000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:21:47.423948 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 15 00:21:47.425000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.426081 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 15 00:21:47.427000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.427831 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 15 00:21:47.429000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.440371 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 15 00:21:47.442563 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 15 00:21:47.443802 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 15 00:21:47.443839 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 15 00:21:47.445746 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 15 00:21:47.447189 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 15 00:21:47.447298 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 15 00:21:47.449724 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 15 00:21:47.451860 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 15 00:21:47.453072 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 15 00:21:47.466349 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 15 00:21:47.467576 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 15 00:21:47.468641 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 15 00:21:47.470525 systemd-journald[1305]: Time spent on flushing to /var/log/journal/793873be56a44770b8aaf5b0fa3487c7 is 23.547ms for 1817 entries. Jan 15 00:21:47.470525 systemd-journald[1305]: System Journal (/var/log/journal/793873be56a44770b8aaf5b0fa3487c7) is 8M, max 588.1M, 580.1M free. Jan 15 00:21:47.500691 systemd-journald[1305]: Received client request to flush runtime journal. Jan 15 00:21:47.500727 kernel: loop1: detected capacity change from 0 to 100192 Jan 15 00:21:47.486000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:21:47.491000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.471964 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 15 00:21:47.474490 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 15 00:21:47.483695 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 15 00:21:47.489559 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 15 00:21:47.491366 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 15 00:21:47.495670 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 15 00:21:47.503230 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 15 00:21:47.504000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.510810 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 15 00:21:47.511724 systemd-tmpfiles[1352]: ACLs are not supported, ignoring. Jan 15 00:21:47.511743 systemd-tmpfiles[1352]: ACLs are not supported, ignoring. Jan 15 00:21:47.511000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.517792 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 15 00:21:47.520000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.521772 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 15 00:21:47.540304 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 15 00:21:47.541000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.553224 kernel: loop2: detected capacity change from 0 to 109872 Jan 15 00:21:47.561433 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 15 00:21:47.562000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.563000 audit: BPF prog-id=18 op=LOAD Jan 15 00:21:47.564000 audit: BPF prog-id=19 op=LOAD Jan 15 00:21:47.564000 audit: BPF prog-id=20 op=LOAD Jan 15 00:21:47.565223 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 15 00:21:47.567000 audit: BPF prog-id=21 op=LOAD Jan 15 00:21:47.568005 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
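As a quick sanity check on the flush statistics reported a few lines above (23.547 ms spent writing 1817 entries to /var/log/journal), the average cost works out to roughly 13 µs per entry:

    # Per-entry cost of the journal flush reported above.
    flush_ms = 23.547   # total flush time from the log
    entries = 1817      # entries flushed, from the same log line

    print(f"~{flush_ms * 1000 / entries:.1f} us per entry")  # ~13.0 us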
Jan 15 00:21:47.571525 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 15 00:21:47.576000 audit: BPF prog-id=22 op=LOAD Jan 15 00:21:47.576000 audit: BPF prog-id=23 op=LOAD Jan 15 00:21:47.576000 audit: BPF prog-id=24 op=LOAD Jan 15 00:21:47.577782 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 15 00:21:47.594000 audit: BPF prog-id=25 op=LOAD Jan 15 00:21:47.594000 audit: BPF prog-id=26 op=LOAD Jan 15 00:21:47.594000 audit: BPF prog-id=27 op=LOAD Jan 15 00:21:47.595611 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 15 00:21:47.601200 kernel: loop3: detected capacity change from 0 to 207008 Jan 15 00:21:47.604920 systemd-tmpfiles[1374]: ACLs are not supported, ignoring. Jan 15 00:21:47.604931 systemd-tmpfiles[1374]: ACLs are not supported, ignoring. Jan 15 00:21:47.609678 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 15 00:21:47.610000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.626226 systemd-nsresourced[1376]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 15 00:21:47.628000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.627322 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 15 00:21:47.634165 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 15 00:21:47.635000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.639344 kernel: loop4: detected capacity change from 0 to 1648 Jan 15 00:21:47.666212 kernel: loop5: detected capacity change from 0 to 100192 Jan 15 00:21:47.684222 kernel: loop6: detected capacity change from 0 to 109872 Jan 15 00:21:47.686660 systemd-oomd[1372]: No swap; memory pressure usage will be degraded Jan 15 00:21:47.687606 systemd-resolved[1373]: Positive Trust Anchors: Jan 15 00:21:47.687627 systemd-resolved[1373]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 15 00:21:47.687631 systemd-resolved[1373]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 15 00:21:47.687636 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. 
Jan 15 00:21:47.687663 systemd-resolved[1373]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 15 00:21:47.688000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.700768 systemd-resolved[1373]: Using system hostname 'ci-4515-1-0-n-1ddc109f0f'. Jan 15 00:21:47.702821 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 15 00:21:47.703000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.704040 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 15 00:21:47.705195 kernel: loop7: detected capacity change from 0 to 207008 Jan 15 00:21:47.726202 kernel: loop1: detected capacity change from 0 to 1648 Jan 15 00:21:47.730386 (sd-merge)[1398]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'. Jan 15 00:21:47.733370 (sd-merge)[1398]: Merged extensions into '/usr'. Jan 15 00:21:47.737744 systemd[1]: Reload requested from client PID 1351 ('systemd-sysext') (unit systemd-sysext.service)... Jan 15 00:21:47.737832 systemd[1]: Reloading... Jan 15 00:21:47.792432 zram_generator::config[1430]: No configuration found. Jan 15 00:21:47.940739 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 15 00:21:47.940864 systemd[1]: Reloading finished in 202 ms. Jan 15 00:21:47.975428 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 15 00:21:47.976000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.978199 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 15 00:21:47.979000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:47.994735 systemd[1]: Starting ensure-sysext.service... Jan 15 00:21:47.996451 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 15 00:21:47.997000 audit: BPF prog-id=8 op=UNLOAD Jan 15 00:21:47.997000 audit: BPF prog-id=7 op=UNLOAD Jan 15 00:21:47.997000 audit: BPF prog-id=28 op=LOAD Jan 15 00:21:47.997000 audit: BPF prog-id=29 op=LOAD Jan 15 00:21:47.998733 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jan 15 00:21:48.000000 audit: BPF prog-id=30 op=LOAD Jan 15 00:21:48.000000 audit: BPF prog-id=15 op=UNLOAD Jan 15 00:21:48.000000 audit: BPF prog-id=31 op=LOAD Jan 15 00:21:48.000000 audit: BPF prog-id=32 op=LOAD Jan 15 00:21:48.000000 audit: BPF prog-id=16 op=UNLOAD Jan 15 00:21:48.000000 audit: BPF prog-id=17 op=UNLOAD Jan 15 00:21:48.001000 audit: BPF prog-id=33 op=LOAD Jan 15 00:21:48.001000 audit: BPF prog-id=25 op=UNLOAD Jan 15 00:21:48.001000 audit: BPF prog-id=34 op=LOAD Jan 15 00:21:48.001000 audit: BPF prog-id=35 op=LOAD Jan 15 00:21:48.001000 audit: BPF prog-id=26 op=UNLOAD Jan 15 00:21:48.001000 audit: BPF prog-id=27 op=UNLOAD Jan 15 00:21:48.002000 audit: BPF prog-id=36 op=LOAD Jan 15 00:21:48.002000 audit: BPF prog-id=18 op=UNLOAD Jan 15 00:21:48.002000 audit: BPF prog-id=37 op=LOAD Jan 15 00:21:48.002000 audit: BPF prog-id=38 op=LOAD Jan 15 00:21:48.002000 audit: BPF prog-id=19 op=UNLOAD Jan 15 00:21:48.002000 audit: BPF prog-id=20 op=UNLOAD Jan 15 00:21:48.002000 audit: BPF prog-id=39 op=LOAD Jan 15 00:21:48.002000 audit: BPF prog-id=22 op=UNLOAD Jan 15 00:21:48.002000 audit: BPF prog-id=40 op=LOAD Jan 15 00:21:48.002000 audit: BPF prog-id=41 op=LOAD Jan 15 00:21:48.002000 audit: BPF prog-id=23 op=UNLOAD Jan 15 00:21:48.003000 audit: BPF prog-id=24 op=UNLOAD Jan 15 00:21:48.003000 audit: BPF prog-id=42 op=LOAD Jan 15 00:21:48.003000 audit: BPF prog-id=21 op=UNLOAD Jan 15 00:21:48.009906 systemd[1]: Reload requested from client PID 1465 ('systemctl') (unit ensure-sysext.service)... Jan 15 00:21:48.009924 systemd[1]: Reloading... Jan 15 00:21:48.017413 systemd-tmpfiles[1466]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 15 00:21:48.017726 systemd-tmpfiles[1466]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 15 00:21:48.018126 systemd-tmpfiles[1466]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 15 00:21:48.019344 systemd-tmpfiles[1466]: ACLs are not supported, ignoring. Jan 15 00:21:48.019511 systemd-tmpfiles[1466]: ACLs are not supported, ignoring. Jan 15 00:21:48.023616 systemd-udevd[1467]: Using default interface naming scheme 'v257'. Jan 15 00:21:48.028576 systemd-tmpfiles[1466]: Detected autofs mount point /boot during canonicalization of boot. Jan 15 00:21:48.028589 systemd-tmpfiles[1466]: Skipping /boot Jan 15 00:21:48.035297 systemd-tmpfiles[1466]: Detected autofs mount point /boot during canonicalization of boot. Jan 15 00:21:48.035389 systemd-tmpfiles[1466]: Skipping /boot Jan 15 00:21:48.062215 zram_generator::config[1499]: No configuration found. Jan 15 00:21:48.201242 kernel: mousedev: PS/2 mouse device common for all mice Jan 15 00:21:48.245218 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 15 00:21:48.246672 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Jan 15 00:21:48.246898 systemd[1]: Reloading finished in 236 ms. Jan 15 00:21:48.259420 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0 Jan 15 00:21:48.259481 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 15 00:21:48.259515 kernel: [drm] features: -context_init Jan 15 00:21:48.260510 kernel: [drm] number of scanouts: 1 Jan 15 00:21:48.260022 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. 
Jan 15 00:21:48.261000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.263000 audit: BPF prog-id=43 op=LOAD Jan 15 00:21:48.263000 audit: BPF prog-id=44 op=LOAD Jan 15 00:21:48.263000 audit: BPF prog-id=28 op=UNLOAD Jan 15 00:21:48.263000 audit: BPF prog-id=29 op=UNLOAD Jan 15 00:21:48.264000 audit: BPF prog-id=45 op=LOAD Jan 15 00:21:48.264000 audit: BPF prog-id=36 op=UNLOAD Jan 15 00:21:48.264000 audit: BPF prog-id=46 op=LOAD Jan 15 00:21:48.265195 kernel: [drm] number of cap sets: 0 Jan 15 00:21:48.266000 audit: BPF prog-id=47 op=LOAD Jan 15 00:21:48.266000 audit: BPF prog-id=37 op=UNLOAD Jan 15 00:21:48.266000 audit: BPF prog-id=38 op=UNLOAD Jan 15 00:21:48.266000 audit: BPF prog-id=48 op=LOAD Jan 15 00:21:48.266000 audit: BPF prog-id=42 op=UNLOAD Jan 15 00:21:48.267000 audit: BPF prog-id=49 op=LOAD Jan 15 00:21:48.267000 audit: BPF prog-id=39 op=UNLOAD Jan 15 00:21:48.267000 audit: BPF prog-id=50 op=LOAD Jan 15 00:21:48.267000 audit: BPF prog-id=51 op=LOAD Jan 15 00:21:48.267000 audit: BPF prog-id=40 op=UNLOAD Jan 15 00:21:48.267000 audit: BPF prog-id=41 op=UNLOAD Jan 15 00:21:48.268000 audit: BPF prog-id=52 op=LOAD Jan 15 00:21:48.268000 audit: BPF prog-id=30 op=UNLOAD Jan 15 00:21:48.268000 audit: BPF prog-id=53 op=LOAD Jan 15 00:21:48.268000 audit: BPF prog-id=54 op=LOAD Jan 15 00:21:48.268000 audit: BPF prog-id=31 op=UNLOAD Jan 15 00:21:48.268000 audit: BPF prog-id=32 op=UNLOAD Jan 15 00:21:48.273198 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0 Jan 15 00:21:48.284000 audit: BPF prog-id=55 op=LOAD Jan 15 00:21:48.284000 audit: BPF prog-id=33 op=UNLOAD Jan 15 00:21:48.284000 audit: BPF prog-id=56 op=LOAD Jan 15 00:21:48.284000 audit: BPF prog-id=57 op=LOAD Jan 15 00:21:48.284000 audit: BPF prog-id=34 op=UNLOAD Jan 15 00:21:48.284000 audit: BPF prog-id=35 op=UNLOAD Jan 15 00:21:48.288908 kernel: Console: switching to colour frame buffer device 160x50 Jan 15 00:21:48.294407 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 15 00:21:48.295000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.297208 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 15 00:21:48.316216 systemd[1]: Finished ensure-sysext.service. Jan 15 00:21:48.317000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.336345 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 15 00:21:48.338479 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 15 00:21:48.339842 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 15 00:21:48.350426 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 15 00:21:48.353377 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 15 00:21:48.355588 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... 
Jan 15 00:21:48.357719 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 15 00:21:48.362488 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 15 00:21:48.366581 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 15 00:21:48.368801 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 15 00:21:48.370227 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 15 00:21:48.370343 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 15 00:21:48.371491 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 15 00:21:48.375832 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 15 00:21:48.377319 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 15 00:21:48.378897 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 15 00:21:48.382034 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 15 00:21:48.382082 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 15 00:21:48.381000 audit: BPF prog-id=58 op=LOAD Jan 15 00:21:48.382805 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 15 00:21:48.384500 systemd[1]: Reached target time-set.target - System Time Set. Jan 15 00:21:48.387211 kernel: PTP clock support registered Jan 15 00:21:48.391427 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 15 00:21:48.393838 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 00:21:48.396126 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 15 00:21:48.397525 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 15 00:21:48.398000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.400000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.400568 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 15 00:21:48.404453 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 15 00:21:48.406000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.406000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.406815 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 15 00:21:48.407014 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. 
Jan 15 00:21:48.408000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.408000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.409292 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 15 00:21:48.409750 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 15 00:21:48.410000 audit[1613]: SYSTEM_BOOT pid=1613 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.411000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.411000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.413588 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 15 00:21:48.413818 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 15 00:21:48.414000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.414000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.415271 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 15 00:21:48.417277 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 15 00:21:48.418000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.418000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.418526 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 15 00:21:48.418719 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 15 00:21:48.419000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.419000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.420318 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. 
Jan 15 00:21:48.421000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.441288 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 15 00:21:48.442000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.443476 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 15 00:21:48.444000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:21:48.449392 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 15 00:21:48.454447 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 15 00:21:48.455000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 15 00:21:48.455000 audit[1638]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffe2ccca60 a2=420 a3=0 items=0 ppid=1587 pid=1638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:21:48.455000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 15 00:21:48.457520 augenrules[1638]: No rules Jan 15 00:21:48.457697 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 15 00:21:48.457872 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 15 00:21:48.460039 systemd[1]: audit-rules.service: Deactivated successfully. Jan 15 00:21:48.462731 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 15 00:21:48.468694 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 15 00:21:48.470615 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 15 00:21:48.488302 systemd-networkd[1612]: lo: Link UP Jan 15 00:21:48.488309 systemd-networkd[1612]: lo: Gained carrier Jan 15 00:21:48.489492 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 15 00:21:48.489820 systemd-networkd[1612]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 15 00:21:48.489829 systemd-networkd[1612]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 15 00:21:48.490263 systemd-networkd[1612]: eth0: Link UP Jan 15 00:21:48.490468 systemd-networkd[1612]: eth0: Gained carrier Jan 15 00:21:48.490484 systemd-networkd[1612]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 15 00:21:48.491321 systemd[1]: Reached target network.target - Network. 
Jan 15 00:21:48.496347 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 15 00:21:48.498403 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 15 00:21:48.509027 systemd-networkd[1612]: eth0: DHCPv4 address 10.0.3.29/25, gateway 10.0.3.1 acquired from 10.0.3.1 Jan 15 00:21:48.516281 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 00:21:48.518902 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 15 00:21:48.554665 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 15 00:21:48.556154 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 15 00:21:49.021377 ldconfig[1603]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 15 00:21:49.028076 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 15 00:21:49.030641 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 15 00:21:49.063558 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 15 00:21:49.064964 systemd[1]: Reached target sysinit.target - System Initialization. Jan 15 00:21:49.066261 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 15 00:21:49.067499 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 15 00:21:49.068899 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 15 00:21:49.070092 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 15 00:21:49.071333 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 15 00:21:49.072516 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 15 00:21:49.073518 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 15 00:21:49.074622 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 15 00:21:49.074658 systemd[1]: Reached target paths.target - Path Units. Jan 15 00:21:49.075498 systemd[1]: Reached target timers.target - Timer Units. Jan 15 00:21:49.078259 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 15 00:21:49.080517 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 15 00:21:49.083250 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 15 00:21:49.084513 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 15 00:21:49.085643 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 15 00:21:49.090660 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 15 00:21:49.091912 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 15 00:21:49.093572 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 15 00:21:49.094622 systemd[1]: Reached target sockets.target - Socket Units. 
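The DHCPv4 line above is the one record in this section that pins down the machine's runtime addressing (10.0.3.29/25 via gateway 10.0.3.1). A small sketch of extracting those fields from such a journal line with a regular expression follows; the pattern and variable names are illustrative, not anything provided by systemd:

    # Sketch: extract interface, address, prefix and gateway from a
    # systemd-networkd DHCPv4 journal message like the one above.
    import re

    line = ("systemd-networkd[1612]: eth0: DHCPv4 address 10.0.3.29/25, "
            "gateway 10.0.3.1 acquired from 10.0.3.1")

    pattern = r"(\S+): DHCPv4 address (\S+)/(\d+), gateway (\S+) acquired from (\S+)"
    match = re.search(pattern, line)
    if match:
        iface, addr, prefix, gateway, server = match.groups()
        print(f"{iface}: {addr}/{prefix} via {gateway} (offered by {server})")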
Jan 15 00:21:49.095513 systemd[1]: Reached target basic.target - Basic System. Jan 15 00:21:49.096386 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 15 00:21:49.096419 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 15 00:21:49.098923 systemd[1]: Starting chronyd.service - NTP client/server... Jan 15 00:21:49.100656 systemd[1]: Starting containerd.service - containerd container runtime... Jan 15 00:21:49.102802 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 15 00:21:49.106347 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 15 00:21:49.108167 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 15 00:21:49.111157 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 15 00:21:49.112193 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 00:21:49.115334 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 15 00:21:49.116301 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 15 00:21:49.119350 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 15 00:21:49.126369 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 15 00:21:49.129367 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 15 00:21:49.132855 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 15 00:21:49.134989 jq[1669]: false Jan 15 00:21:49.136377 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 15 00:21:49.137921 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 15 00:21:49.139428 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 15 00:21:49.140384 systemd[1]: Starting update-engine.service - Update Engine... Jan 15 00:21:49.142642 extend-filesystems[1671]: Found /dev/vda6 Jan 15 00:21:49.142887 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 15 00:21:49.146401 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 15 00:21:49.148506 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 15 00:21:49.148738 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 15 00:21:49.151668 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 15 00:21:49.151882 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 15 00:21:49.153988 extend-filesystems[1671]: Found /dev/vda9 Jan 15 00:21:49.158440 extend-filesystems[1671]: Checking size of /dev/vda9 Jan 15 00:21:49.159972 jq[1683]: true Jan 15 00:21:49.162584 systemd[1]: motdgen.service: Deactivated successfully. Jan 15 00:21:49.161457 chronyd[1663]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 15 00:21:49.162874 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Jan 15 00:21:49.164636 chronyd[1663]: Loaded seccomp filter (level 2) Jan 15 00:21:49.165582 systemd[1]: Started chronyd.service - NTP client/server. Jan 15 00:21:49.183816 extend-filesystems[1671]: Resized partition /dev/vda9 Jan 15 00:21:49.187614 tar[1692]: linux-arm64/LICENSE Jan 15 00:21:49.187614 tar[1692]: linux-arm64/helm Jan 15 00:21:49.189467 jq[1705]: true Jan 15 00:21:49.193169 update_engine[1682]: I20260115 00:21:49.192792 1682 main.cc:92] Flatcar Update Engine starting Jan 15 00:21:49.197385 extend-filesystems[1718]: resize2fs 1.47.3 (8-Jul-2025) Jan 15 00:21:49.201080 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Jan 15 00:21:49.230768 dbus-daemon[1666]: [system] SELinux support is enabled Jan 15 00:21:49.231042 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 15 00:21:49.234760 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 15 00:21:49.234798 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 15 00:21:49.237015 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 15 00:21:49.237042 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 15 00:21:49.244725 systemd[1]: Started update-engine.service - Update Engine. Jan 15 00:21:49.246553 update_engine[1682]: I20260115 00:21:49.244769 1682 update_check_scheduler.cc:74] Next update check in 6m28s Jan 15 00:21:49.247331 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 15 00:21:49.255049 systemd-logind[1681]: New seat seat0. Jan 15 00:21:49.258662 systemd-logind[1681]: Watching system buttons on /dev/input/event0 (Power Button) Jan 15 00:21:49.258818 systemd-logind[1681]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jan 15 00:21:49.259863 systemd[1]: Started systemd-logind.service - User Login Management. 
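The resize logged above grows /dev/vda9 from 1617920 to 11516923 ext4 blocks. Assuming the common 4 KiB ext4 block size (the log itself does not print it), that is roughly a jump from about 6.2 GiB to about 43.9 GiB:

    # Rough before/after size for the ext4 resize logged above.
    # Assumption: 4096-byte ext4 blocks; the block size is not in the log.
    BLOCK_SIZE = 4096

    old_blocks = 1_617_920
    new_blocks = 11_516_923

    def gib(blocks: int) -> float:
        return blocks * BLOCK_SIZE / 2**30

    print(f"before: {gib(old_blocks):.2f} GiB")  # ~6.17 GiB
    print(f"after:  {gib(new_blocks):.2f} GiB")  # ~43.93 GiB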
Jan 15 00:21:49.319656 locksmithd[1735]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 15 00:21:49.331715 containerd[1703]: time="2026-01-15T00:21:49Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 15 00:21:49.341180 containerd[1703]: time="2026-01-15T00:21:49.341140240Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 15 00:21:49.354636 containerd[1703]: time="2026-01-15T00:21:49.354593280Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.8µs" Jan 15 00:21:49.354636 containerd[1703]: time="2026-01-15T00:21:49.354631120Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 15 00:21:49.358617 containerd[1703]: time="2026-01-15T00:21:49.354674760Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 15 00:21:49.358617 containerd[1703]: time="2026-01-15T00:21:49.354686960Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 15 00:21:49.360099 bash[1740]: Updated "/home/core/.ssh/authorized_keys" Jan 15 00:21:49.361455 containerd[1703]: time="2026-01-15T00:21:49.361402480Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 15 00:21:49.361495 containerd[1703]: time="2026-01-15T00:21:49.361471040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 15 00:21:49.361573 containerd[1703]: time="2026-01-15T00:21:49.361547520Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 15 00:21:49.361573 containerd[1703]: time="2026-01-15T00:21:49.361570400Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 15 00:21:49.362071 containerd[1703]: time="2026-01-15T00:21:49.362028040Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 15 00:21:49.362099 containerd[1703]: time="2026-01-15T00:21:49.362070400Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 15 00:21:49.362122 containerd[1703]: time="2026-01-15T00:21:49.362097600Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 15 00:21:49.362122 containerd[1703]: time="2026-01-15T00:21:49.362112880Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 15 00:21:49.362277 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. 
Jan 15 00:21:49.368181 containerd[1703]: time="2026-01-15T00:21:49.367586680Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 15 00:21:49.368181 containerd[1703]: time="2026-01-15T00:21:49.367622840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 15 00:21:49.368850 systemd[1]: Starting sshkeys.service... Jan 15 00:21:49.369800 containerd[1703]: time="2026-01-15T00:21:49.369769520Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 15 00:21:49.370000 containerd[1703]: time="2026-01-15T00:21:49.369976640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 15 00:21:49.370058 containerd[1703]: time="2026-01-15T00:21:49.370014640Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 15 00:21:49.370058 containerd[1703]: time="2026-01-15T00:21:49.370025760Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 15 00:21:49.370118 containerd[1703]: time="2026-01-15T00:21:49.370062080Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 15 00:21:49.370313 containerd[1703]: time="2026-01-15T00:21:49.370293600Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 15 00:21:49.372344 containerd[1703]: time="2026-01-15T00:21:49.372309360Z" level=info msg="metadata content store policy set" policy=shared Jan 15 00:21:49.395330 containerd[1703]: time="2026-01-15T00:21:49.395287200Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 15 00:21:49.395656 containerd[1703]: time="2026-01-15T00:21:49.395462080Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 15 00:21:49.395949 containerd[1703]: time="2026-01-15T00:21:49.395917640Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 15 00:21:49.396025 containerd[1703]: time="2026-01-15T00:21:49.396005240Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 15 00:21:49.396071 containerd[1703]: time="2026-01-15T00:21:49.396033720Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 15 00:21:49.396071 containerd[1703]: time="2026-01-15T00:21:49.396057320Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 15 00:21:49.396143 containerd[1703]: time="2026-01-15T00:21:49.396119840Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 15 00:21:49.396184 containerd[1703]: time="2026-01-15T00:21:49.396143120Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 15 00:21:49.396987 containerd[1703]: time="2026-01-15T00:21:49.396167400Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 
Jan 15 00:21:49.396987 containerd[1703]: time="2026-01-15T00:21:49.396264960Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 15 00:21:49.396987 containerd[1703]: time="2026-01-15T00:21:49.396279600Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 15 00:21:49.396987 containerd[1703]: time="2026-01-15T00:21:49.396290800Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 15 00:21:49.396987 containerd[1703]: time="2026-01-15T00:21:49.396300880Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 15 00:21:49.396987 containerd[1703]: time="2026-01-15T00:21:49.396329360Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 15 00:21:49.396987 containerd[1703]: time="2026-01-15T00:21:49.396580960Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 15 00:21:49.396987 containerd[1703]: time="2026-01-15T00:21:49.396620960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 15 00:21:49.396987 containerd[1703]: time="2026-01-15T00:21:49.396636600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 15 00:21:49.396987 containerd[1703]: time="2026-01-15T00:21:49.396647040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 15 00:21:49.396987 containerd[1703]: time="2026-01-15T00:21:49.396657600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 15 00:21:49.396987 containerd[1703]: time="2026-01-15T00:21:49.396767160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 15 00:21:49.396987 containerd[1703]: time="2026-01-15T00:21:49.396785640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 15 00:21:49.396987 containerd[1703]: time="2026-01-15T00:21:49.396803480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 15 00:21:49.397273 containerd[1703]: time="2026-01-15T00:21:49.396822440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 15 00:21:49.397273 containerd[1703]: time="2026-01-15T00:21:49.397021680Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 15 00:21:49.397273 containerd[1703]: time="2026-01-15T00:21:49.397037640Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 15 00:21:49.397273 containerd[1703]: time="2026-01-15T00:21:49.397067280Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 15 00:21:49.397273 containerd[1703]: time="2026-01-15T00:21:49.397122240Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 15 00:21:49.397273 containerd[1703]: time="2026-01-15T00:21:49.397137120Z" level=info msg="Start snapshots syncer" Jan 15 00:21:49.397273 containerd[1703]: time="2026-01-15T00:21:49.397184080Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 15 00:21:49.399178 containerd[1703]: 
time="2026-01-15T00:21:49.397592960Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 15 00:21:49.399178 containerd[1703]: time="2026-01-15T00:21:49.397671480Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 15 00:21:49.399311 containerd[1703]: time="2026-01-15T00:21:49.397791480Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 15 00:21:49.399311 containerd[1703]: time="2026-01-15T00:21:49.398085920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 15 00:21:49.399311 containerd[1703]: time="2026-01-15T00:21:49.398135480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 15 00:21:49.399311 containerd[1703]: time="2026-01-15T00:21:49.398150840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 15 00:21:49.399311 containerd[1703]: time="2026-01-15T00:21:49.398161320Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 15 00:21:49.399311 containerd[1703]: time="2026-01-15T00:21:49.398187280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 15 00:21:49.399311 containerd[1703]: time="2026-01-15T00:21:49.398199400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 15 00:21:49.399311 containerd[1703]: time="2026-01-15T00:21:49.398218160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 15 00:21:49.399311 containerd[1703]: time="2026-01-15T00:21:49.398230360Z" level=info msg="loading 
plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 15 00:21:49.399311 containerd[1703]: time="2026-01-15T00:21:49.398241400Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 15 00:21:49.399311 containerd[1703]: time="2026-01-15T00:21:49.398333840Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 15 00:21:49.399311 containerd[1703]: time="2026-01-15T00:21:49.398352160Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 15 00:21:49.399311 containerd[1703]: time="2026-01-15T00:21:49.398423440Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 15 00:21:49.399512 containerd[1703]: time="2026-01-15T00:21:49.398437440Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 15 00:21:49.399512 containerd[1703]: time="2026-01-15T00:21:49.398497800Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 15 00:21:49.399512 containerd[1703]: time="2026-01-15T00:21:49.398510520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 15 00:21:49.399512 containerd[1703]: time="2026-01-15T00:21:49.398522840Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 15 00:21:49.399512 containerd[1703]: time="2026-01-15T00:21:49.398535600Z" level=info msg="runtime interface created" Jan 15 00:21:49.399512 containerd[1703]: time="2026-01-15T00:21:49.398542800Z" level=info msg="created NRI interface" Jan 15 00:21:49.399512 containerd[1703]: time="2026-01-15T00:21:49.398552080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 15 00:21:49.399512 containerd[1703]: time="2026-01-15T00:21:49.398576080Z" level=info msg="Connect containerd service" Jan 15 00:21:49.399512 containerd[1703]: time="2026-01-15T00:21:49.398658760Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 15 00:21:49.400962 containerd[1703]: time="2026-01-15T00:21:49.400927120Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 15 00:21:49.405698 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 15 00:21:49.409553 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
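Annotation: the one error in this start-up, "failed to load cni during init ... no network config found in /etc/cni/net.d", is expected at this stage. The CRI plugin simply has no pod network yet; the conf syncer started below will pick one up as soon as something writes a conflist into that directory, normally the CNI add-on installed later by the cluster. Purely as an illustration of what such a file looks like, not what this node will actually use, a minimal bridge conflist could be generated like this (names, subnet and filename are made up):

```python
# Illustrative only: drop a minimal bridge/host-local conflist where the CRI
# plugin is looking (/etc/cni/net.d, per the error above). A real cluster's
# CNI add-on writes its own configuration.
import json
import pathlib

conflist = {
    "cniVersion": "1.0.0",
    "name": "example-bridge",
    "plugins": [
        {
            "type": "bridge",
            "bridge": "cni0",
            "isGateway": True,
            "ipMasq": True,
            "ipam": {
                "type": "host-local",
                "ranges": [[{"subnet": "10.85.0.0/16"}]],
                "routes": [{"dst": "0.0.0.0/0"}],
            },
        },
        {"type": "portmap", "capabilities": {"portMappings": True}},
    ],
}

path = pathlib.Path("/etc/cni/net.d/10-example.conflist")
path.parent.mkdir(parents=True, exist_ok=True)
path.write_text(json.dumps(conflist, indent=2) + "\n")
```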
Jan 15 00:21:49.432192 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 00:21:49.492890 containerd[1703]: time="2026-01-15T00:21:49.492824320Z" level=info msg="Start subscribing containerd event" Jan 15 00:21:49.493141 containerd[1703]: time="2026-01-15T00:21:49.493110480Z" level=info msg="Start recovering state" Jan 15 00:21:49.493254 containerd[1703]: time="2026-01-15T00:21:49.493235920Z" level=info msg="Start event monitor" Jan 15 00:21:49.493282 containerd[1703]: time="2026-01-15T00:21:49.493259000Z" level=info msg="Start cni network conf syncer for default" Jan 15 00:21:49.493282 containerd[1703]: time="2026-01-15T00:21:49.493267560Z" level=info msg="Start streaming server" Jan 15 00:21:49.493282 containerd[1703]: time="2026-01-15T00:21:49.493276360Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 15 00:21:49.493330 containerd[1703]: time="2026-01-15T00:21:49.493283400Z" level=info msg="runtime interface starting up..." Jan 15 00:21:49.493330 containerd[1703]: time="2026-01-15T00:21:49.493289040Z" level=info msg="starting plugins..." Jan 15 00:21:49.493330 containerd[1703]: time="2026-01-15T00:21:49.493302280Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 15 00:21:49.495178 containerd[1703]: time="2026-01-15T00:21:49.493412600Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 15 00:21:49.495178 containerd[1703]: time="2026-01-15T00:21:49.493713280Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 15 00:21:49.495178 containerd[1703]: time="2026-01-15T00:21:49.493992200Z" level=info msg="containerd successfully booted in 0.162612s" Jan 15 00:21:49.494277 systemd[1]: Started containerd.service - containerd container runtime. Jan 15 00:21:49.506204 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 15 00:21:49.521462 extend-filesystems[1718]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 15 00:21:49.521462 extend-filesystems[1718]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 15 00:21:49.521462 extend-filesystems[1718]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Jan 15 00:21:49.526596 extend-filesystems[1671]: Resized filesystem in /dev/vda9 Jan 15 00:21:49.524005 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 15 00:21:49.526283 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 15 00:21:49.631869 tar[1692]: linux-arm64/README.md Jan 15 00:21:49.648808 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 15 00:21:50.127200 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 00:21:50.353346 systemd-networkd[1612]: eth0: Gained IPv6LL Jan 15 00:21:50.356127 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 15 00:21:50.357866 systemd[1]: Reached target network-online.target - Network is Online. Jan 15 00:21:50.361398 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 00:21:50.364293 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 15 00:21:50.398887 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 15 00:21:50.450218 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 00:21:51.221032 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
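Annotation: the extend-filesystems figures above (11516923 blocks of 4 KiB on /dev/vda9) work out to roughly 44 GiB for the root filesystem:

```python
# Arithmetic check on the resize reported above.
blocks, block_size = 11_516_923, 4096
size_bytes = blocks * block_size
print(size_bytes)                      # 47173316608
print(round(size_bytes / 2**30, 2))    # ~43.93 GiB
```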
Jan 15 00:21:51.222824 sshd_keygen[1693]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 15 00:21:51.234732 (kubelet)[1792]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 00:21:51.244757 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 15 00:21:51.247771 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 15 00:21:51.271276 systemd[1]: issuegen.service: Deactivated successfully. Jan 15 00:21:51.271581 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 15 00:21:51.275331 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 15 00:21:51.298341 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 15 00:21:51.302691 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 15 00:21:51.305890 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jan 15 00:21:51.308201 systemd[1]: Reached target getty.target - Login Prompts. Jan 15 00:21:51.755970 kubelet[1792]: E0115 00:21:51.755885 1792 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 00:21:51.758617 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 00:21:51.758765 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 00:21:51.761264 systemd[1]: kubelet.service: Consumed 763ms CPU time, 256.2M memory peak. Jan 15 00:21:52.137203 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 00:21:52.462277 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 00:21:54.146215 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 15 00:21:54.148436 systemd[1]: Started sshd@0-10.0.3.29:22-20.161.92.111:48350.service - OpenSSH per-connection server daemon (20.161.92.111:48350). Jan 15 00:21:54.712100 sshd[1819]: Accepted publickey for core from 20.161.92.111 port 48350 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:21:54.714555 sshd-session[1819]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:21:54.721318 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 15 00:21:54.723379 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 15 00:21:54.728940 systemd-logind[1681]: New session 1 of user core. Jan 15 00:21:54.748753 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 15 00:21:54.752617 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 15 00:21:54.765516 (systemd)[1824]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 15 00:21:54.768153 systemd-logind[1681]: New session c1 of user core. Jan 15 00:21:54.881394 systemd[1824]: Queued start job for default target default.target. Jan 15 00:21:54.903166 systemd[1824]: Created slice app.slice - User Application Slice. Jan 15 00:21:54.903430 systemd[1824]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 15 00:21:54.903447 systemd[1824]: Reached target paths.target - Paths. Jan 15 00:21:54.903510 systemd[1824]: Reached target timers.target - Timers. 
Jan 15 00:21:54.904756 systemd[1824]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 15 00:21:54.905475 systemd[1824]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 15 00:21:54.914443 systemd[1824]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 15 00:21:54.914503 systemd[1824]: Reached target sockets.target - Sockets. Jan 15 00:21:54.917404 systemd[1824]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 15 00:21:54.917556 systemd[1824]: Reached target basic.target - Basic System. Jan 15 00:21:54.917716 systemd[1824]: Reached target default.target - Main User Target. Jan 15 00:21:54.917787 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 15 00:21:54.917806 systemd[1824]: Startup finished in 143ms. Jan 15 00:21:54.920516 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 15 00:21:55.237061 systemd[1]: Started sshd@1-10.0.3.29:22-20.161.92.111:48356.service - OpenSSH per-connection server daemon (20.161.92.111:48356). Jan 15 00:21:55.789222 sshd[1837]: Accepted publickey for core from 20.161.92.111 port 48356 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:21:55.790497 sshd-session[1837]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:21:55.794892 systemd-logind[1681]: New session 2 of user core. Jan 15 00:21:55.802377 systemd[1]: Started session-2.scope - Session 2 of User core. Jan 15 00:21:56.092217 sshd[1840]: Connection closed by 20.161.92.111 port 48356 Jan 15 00:21:56.091826 sshd-session[1837]: pam_unix(sshd:session): session closed for user core Jan 15 00:21:56.095956 systemd[1]: sshd@1-10.0.3.29:22-20.161.92.111:48356.service: Deactivated successfully. Jan 15 00:21:56.099726 systemd[1]: session-2.scope: Deactivated successfully. Jan 15 00:21:56.100550 systemd-logind[1681]: Session 2 logged out. Waiting for processes to exit. Jan 15 00:21:56.101493 systemd-logind[1681]: Removed session 2. Jan 15 00:21:56.149203 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 00:21:56.158485 coreos-metadata[1665]: Jan 15 00:21:56.158 WARN failed to locate config-drive, using the metadata service API instead Jan 15 00:21:56.176034 coreos-metadata[1665]: Jan 15 00:21:56.175 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 15 00:21:56.212149 systemd[1]: Started sshd@2-10.0.3.29:22-20.161.92.111:48370.service - OpenSSH per-connection server daemon (20.161.92.111:48370). 
Jan 15 00:21:56.445704 coreos-metadata[1665]: Jan 15 00:21:56.445 INFO Fetch successful Jan 15 00:21:56.446000 coreos-metadata[1665]: Jan 15 00:21:56.445 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 15 00:21:56.471268 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 00:21:56.479548 coreos-metadata[1753]: Jan 15 00:21:56.479 WARN failed to locate config-drive, using the metadata service API instead Jan 15 00:21:56.492132 coreos-metadata[1753]: Jan 15 00:21:56.492 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 15 00:21:56.725887 coreos-metadata[1665]: Jan 15 00:21:56.725 INFO Fetch successful Jan 15 00:21:56.725887 coreos-metadata[1665]: Jan 15 00:21:56.725 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 15 00:21:56.729251 coreos-metadata[1753]: Jan 15 00:21:56.729 INFO Fetch successful Jan 15 00:21:56.729251 coreos-metadata[1753]: Jan 15 00:21:56.729 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 15 00:21:56.742212 sshd[1848]: Accepted publickey for core from 20.161.92.111 port 48370 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:21:56.743495 sshd-session[1848]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:21:56.748080 systemd-logind[1681]: New session 3 of user core. Jan 15 00:21:56.758924 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 15 00:21:57.002988 coreos-metadata[1665]: Jan 15 00:21:57.002 INFO Fetch successful Jan 15 00:21:57.002988 coreos-metadata[1665]: Jan 15 00:21:57.002 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 15 00:21:57.005205 coreos-metadata[1753]: Jan 15 00:21:57.005 INFO Fetch successful Jan 15 00:21:57.007036 unknown[1753]: wrote ssh authorized keys file for user: core Jan 15 00:21:57.038375 update-ssh-keys[1859]: Updated "/home/core/.ssh/authorized_keys" Jan 15 00:21:57.039483 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 15 00:21:57.041318 systemd[1]: Finished sshkeys.service. Jan 15 00:21:57.045444 sshd[1857]: Connection closed by 20.161.92.111 port 48370 Jan 15 00:21:57.045919 sshd-session[1848]: pam_unix(sshd:session): session closed for user core Jan 15 00:21:57.049710 systemd[1]: sshd@2-10.0.3.29:22-20.161.92.111:48370.service: Deactivated successfully. Jan 15 00:21:57.051390 systemd[1]: session-3.scope: Deactivated successfully. Jan 15 00:21:57.052082 systemd-logind[1681]: Session 3 logged out. Waiting for processes to exit. Jan 15 00:21:57.053227 systemd-logind[1681]: Removed session 3. Jan 15 00:21:57.150811 coreos-metadata[1665]: Jan 15 00:21:57.150 INFO Fetch successful Jan 15 00:21:57.150811 coreos-metadata[1665]: Jan 15 00:21:57.150 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 15 00:21:57.292928 coreos-metadata[1665]: Jan 15 00:21:57.292 INFO Fetch successful Jan 15 00:21:57.292928 coreos-metadata[1665]: Jan 15 00:21:57.292 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 15 00:21:57.433108 coreos-metadata[1665]: Jan 15 00:21:57.433 INFO Fetch successful Jan 15 00:21:57.470519 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 15 00:21:57.470999 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
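Annotation: both metadata agents above (coreos-metadata for the host and the sshkeys variant) log the same pattern: no config-drive is present (hence the repeated "Can't lookup blockdev" for /dev/disk/by-label/config-2), so they fall back to the link-local metadata service at 169.254.169.254. The equivalent queries, using only endpoints that appear in this log, look roughly like this; timeouts and error handling are illustrative, and it only makes sense on the instance itself:

```python
# Sketch of the metadata-service fallback logged above.
from urllib.request import urlopen

BASE = "http://169.254.169.254"
PATHS = (
    "/latest/meta-data/hostname",
    "/latest/meta-data/instance-id",
    "/latest/meta-data/public-keys/0/openssh-key",
)

for p in PATHS:
    with urlopen(BASE + p, timeout=2) as resp:
        print(p, "->", resp.read().decode().strip())
```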
Jan 15 00:21:57.471143 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 15 00:21:57.474238 systemd[1]: Startup finished in 2.779s (kernel) + 14.423s (initrd) + 10.940s (userspace) = 28.144s. Jan 15 00:22:01.770000 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 15 00:22:01.771560 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 00:22:01.917398 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 00:22:01.921160 (kubelet)[1879]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 00:22:01.964659 kubelet[1879]: E0115 00:22:01.964602 1879 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 00:22:01.967808 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 00:22:01.967945 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 00:22:01.969318 systemd[1]: kubelet.service: Consumed 156ms CPU time, 107.9M memory peak. Jan 15 00:22:07.162105 systemd[1]: Started sshd@3-10.0.3.29:22-20.161.92.111:37662.service - OpenSSH per-connection server daemon (20.161.92.111:37662). Jan 15 00:22:07.689305 sshd[1888]: Accepted publickey for core from 20.161.92.111 port 37662 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:22:07.690563 sshd-session[1888]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:22:07.695531 systemd-logind[1681]: New session 4 of user core. Jan 15 00:22:07.701383 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 15 00:22:07.982870 sshd[1891]: Connection closed by 20.161.92.111 port 37662 Jan 15 00:22:07.983356 sshd-session[1888]: pam_unix(sshd:session): session closed for user core Jan 15 00:22:07.986548 systemd[1]: sshd@3-10.0.3.29:22-20.161.92.111:37662.service: Deactivated successfully. Jan 15 00:22:07.988349 systemd[1]: session-4.scope: Deactivated successfully. Jan 15 00:22:07.990407 systemd-logind[1681]: Session 4 logged out. Waiting for processes to exit. Jan 15 00:22:07.991320 systemd-logind[1681]: Removed session 4. Jan 15 00:22:08.096912 systemd[1]: Started sshd@4-10.0.3.29:22-20.161.92.111:37678.service - OpenSSH per-connection server daemon (20.161.92.111:37678). Jan 15 00:22:08.627075 sshd[1897]: Accepted publickey for core from 20.161.92.111 port 37678 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:22:08.628353 sshd-session[1897]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:22:08.633323 systemd-logind[1681]: New session 5 of user core. Jan 15 00:22:08.647421 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 15 00:22:08.926517 sshd[1900]: Connection closed by 20.161.92.111 port 37678 Jan 15 00:22:08.926957 sshd-session[1897]: pam_unix(sshd:session): session closed for user core Jan 15 00:22:08.930152 systemd[1]: sshd@4-10.0.3.29:22-20.161.92.111:37678.service: Deactivated successfully. Jan 15 00:22:08.931755 systemd[1]: session-5.scope: Deactivated successfully. Jan 15 00:22:08.934763 systemd-logind[1681]: Session 5 logged out. Waiting for processes to exit. 
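Annotation: the kubelet failures above are the usual pre-bootstrap state: /var/lib/kubelet/config.yaml does not exist until the node is provisioned (typically by kubeadm init/join), so the unit exits and systemd re-queues it. The journal timestamps show the retry cadence is about ten seconds:

```python
# Restart interval taken from two journal timestamps above:
# kubelet.service failed at 00:21:51.758765 and the restart job for
# counter 1 was scheduled at 00:22:01.770000.
from datetime import datetime

FMT = "%H:%M:%S.%f"
failed = datetime.strptime("00:21:51.758765", FMT)
restarted = datetime.strptime("00:22:01.770000", FMT)
print((restarted - failed).total_seconds())   # ~10.0 s between attempts
```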
Jan 15 00:22:08.935629 systemd-logind[1681]: Removed session 5. Jan 15 00:22:09.037869 systemd[1]: Started sshd@5-10.0.3.29:22-20.161.92.111:37688.service - OpenSSH per-connection server daemon (20.161.92.111:37688). Jan 15 00:22:09.559261 sshd[1906]: Accepted publickey for core from 20.161.92.111 port 37688 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:22:09.560568 sshd-session[1906]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:22:09.565802 systemd-logind[1681]: New session 6 of user core. Jan 15 00:22:09.576383 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 15 00:22:09.852539 sshd[1909]: Connection closed by 20.161.92.111 port 37688 Jan 15 00:22:09.853346 sshd-session[1906]: pam_unix(sshd:session): session closed for user core Jan 15 00:22:09.857058 systemd[1]: sshd@5-10.0.3.29:22-20.161.92.111:37688.service: Deactivated successfully. Jan 15 00:22:09.858707 systemd[1]: session-6.scope: Deactivated successfully. Jan 15 00:22:09.859562 systemd-logind[1681]: Session 6 logged out. Waiting for processes to exit. Jan 15 00:22:09.860840 systemd-logind[1681]: Removed session 6. Jan 15 00:22:09.964896 systemd[1]: Started sshd@6-10.0.3.29:22-20.161.92.111:37698.service - OpenSSH per-connection server daemon (20.161.92.111:37698). Jan 15 00:22:10.500235 sshd[1915]: Accepted publickey for core from 20.161.92.111 port 37698 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:22:10.500824 sshd-session[1915]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:22:10.505795 systemd-logind[1681]: New session 7 of user core. Jan 15 00:22:10.519461 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 15 00:22:10.713750 sudo[1919]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 15 00:22:10.714021 sudo[1919]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 00:22:10.742585 sudo[1919]: pam_unix(sudo:session): session closed for user root Jan 15 00:22:10.839669 sshd[1918]: Connection closed by 20.161.92.111 port 37698 Jan 15 00:22:10.840240 sshd-session[1915]: pam_unix(sshd:session): session closed for user core Jan 15 00:22:10.844420 systemd[1]: sshd@6-10.0.3.29:22-20.161.92.111:37698.service: Deactivated successfully. Jan 15 00:22:10.846315 systemd[1]: session-7.scope: Deactivated successfully. Jan 15 00:22:10.847750 systemd-logind[1681]: Session 7 logged out. Waiting for processes to exit. Jan 15 00:22:10.848716 systemd-logind[1681]: Removed session 7. Jan 15 00:22:10.951013 systemd[1]: Started sshd@7-10.0.3.29:22-20.161.92.111:37704.service - OpenSSH per-connection server daemon (20.161.92.111:37704). Jan 15 00:22:11.469392 sshd[1925]: Accepted publickey for core from 20.161.92.111 port 37704 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:22:11.470648 sshd-session[1925]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:22:11.475324 systemd-logind[1681]: New session 8 of user core. Jan 15 00:22:11.493421 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jan 15 00:22:11.668770 sudo[1930]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 15 00:22:11.669037 sudo[1930]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 00:22:11.673736 sudo[1930]: pam_unix(sudo:session): session closed for user root Jan 15 00:22:11.680066 sudo[1929]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 15 00:22:11.680358 sudo[1929]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 00:22:11.689304 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 15 00:22:11.722000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 15 00:22:11.723550 augenrules[1952]: No rules Jan 15 00:22:11.725440 kernel: kauditd_printk_skb: 192 callbacks suppressed Jan 15 00:22:11.725514 kernel: audit: type=1305 audit(1768436531.722:239): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 15 00:22:11.722000 audit[1952]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffd2157f30 a2=420 a3=0 items=0 ppid=1933 pid=1952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:11.726618 systemd[1]: audit-rules.service: Deactivated successfully. Jan 15 00:22:11.726935 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 15 00:22:11.728370 sudo[1929]: pam_unix(sudo:session): session closed for user root Jan 15 00:22:11.729664 kernel: audit: type=1300 audit(1768436531.722:239): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffd2157f30 a2=420 a3=0 items=0 ppid=1933 pid=1952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:11.722000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 15 00:22:11.731670 kernel: audit: type=1327 audit(1768436531.722:239): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 15 00:22:11.731707 kernel: audit: type=1130 audit(1768436531.725:240): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:11.725000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:11.725000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:11.736600 kernel: audit: type=1131 audit(1768436531.725:241): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:22:11.736653 kernel: audit: type=1106 audit(1768436531.725:242): pid=1929 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 00:22:11.725000 audit[1929]: USER_END pid=1929 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 00:22:11.725000 audit[1929]: CRED_DISP pid=1929 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 00:22:11.741964 kernel: audit: type=1104 audit(1768436531.725:243): pid=1929 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 00:22:11.825738 sshd[1928]: Connection closed by 20.161.92.111 port 37704 Jan 15 00:22:11.826094 sshd-session[1925]: pam_unix(sshd:session): session closed for user core Jan 15 00:22:11.827000 audit[1925]: USER_END pid=1925 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:22:11.830244 systemd[1]: sshd@7-10.0.3.29:22-20.161.92.111:37704.service: Deactivated successfully. Jan 15 00:22:11.827000 audit[1925]: CRED_DISP pid=1925 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:22:11.832281 systemd[1]: session-8.scope: Deactivated successfully. Jan 15 00:22:11.836105 kernel: audit: type=1106 audit(1768436531.827:244): pid=1925 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:22:11.836169 kernel: audit: type=1104 audit(1768436531.827:245): pid=1925 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:22:11.836220 systemd-logind[1681]: Session 8 logged out. Waiting for processes to exit. Jan 15 00:22:11.830000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.3.29:22-20.161.92.111:37704 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:11.839357 kernel: audit: type=1131 audit(1768436531.830:246): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.3.29:22-20.161.92.111:37704 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:11.839782 systemd-logind[1681]: Removed session 8. 
Jan 15 00:22:11.940000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.3.29:22-20.161.92.111:37720 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:11.940266 systemd[1]: Started sshd@8-10.0.3.29:22-20.161.92.111:37720.service - OpenSSH per-connection server daemon (20.161.92.111:37720). Jan 15 00:22:12.020359 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 15 00:22:12.022197 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 00:22:12.153563 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 00:22:12.153000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:12.158202 (kubelet)[1972]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 00:22:12.191832 kubelet[1972]: E0115 00:22:12.191763 1972 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 00:22:12.194294 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 00:22:12.194431 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 00:22:12.194000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 15 00:22:12.194819 systemd[1]: kubelet.service: Consumed 142ms CPU time, 106.9M memory peak. Jan 15 00:22:12.470000 audit[1961]: USER_ACCT pid=1961 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:22:12.471337 sshd[1961]: Accepted publickey for core from 20.161.92.111 port 37720 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:22:12.472000 audit[1961]: CRED_ACQ pid=1961 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:22:12.472000 audit[1961]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc5492180 a2=3 a3=0 items=0 ppid=1 pid=1961 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:12.472000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:22:12.472634 sshd-session[1961]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:22:12.476760 systemd-logind[1681]: New session 9 of user core. Jan 15 00:22:12.484505 systemd[1]: Started session-9.scope - Session 9 of User core. 
Jan 15 00:22:12.486000 audit[1961]: USER_START pid=1961 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:22:12.487000 audit[1980]: CRED_ACQ pid=1980 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:22:12.675916 sudo[1981]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 15 00:22:12.675000 audit[1981]: USER_ACCT pid=1981 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 00:22:12.675000 audit[1981]: CRED_REFR pid=1981 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 00:22:12.676214 sudo[1981]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 00:22:12.678000 audit[1981]: USER_START pid=1981 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 00:22:12.947804 chronyd[1663]: Selected source PHC0 Jan 15 00:22:13.024093 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 15 00:22:13.044608 (dockerd)[2001]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 15 00:22:13.305165 dockerd[2001]: time="2026-01-15T00:22:13.305090202Z" level=info msg="Starting up" Jan 15 00:22:13.307146 dockerd[2001]: time="2026-01-15T00:22:13.307102955Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 15 00:22:13.318404 dockerd[2001]: time="2026-01-15T00:22:13.318356656Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 15 00:22:13.357588 dockerd[2001]: time="2026-01-15T00:22:13.357391919Z" level=info msg="Loading containers: start." 
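Annotation: everything from here on is dockerd programming its default iptables/ip6tables chains (DOCKER, DOCKER-FORWARD, DOCKER-BRIDGE, DOCKER-CT, DOCKER-USER, the ISOLATION stages, plus the MASQUERADE rule for 172.17.0.0/16). Each NETFILTER_CFG/SYSCALL audit record encodes the exact command as NUL-separated, hex-encoded argv in its PROCTITLE line, which a couple of lines of Python will decode; the sample value is copied from the first record below:

```python
# Decode the hex PROCTITLE payloads in the audit records below into the
# iptables command lines they represent (argv is NUL-separated).
def decode_proctitle(hex_argv: str) -> str:
    return " ".join(bytes.fromhex(hex_argv).decode().split("\x00"))

sample = ("2F7573722F62696E2F69707461626C6573002D2D77616974"
          "002D74006E6174002D4E00444F434B4552")
print(decode_proctitle(sample))   # /usr/bin/iptables --wait -t nat -N DOCKER
```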
Jan 15 00:22:13.367275 kernel: Initializing XFRM netlink socket Jan 15 00:22:13.422000 audit[2052]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2052 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.422000 audit[2052]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffdbe0d4b0 a2=0 a3=0 items=0 ppid=2001 pid=2052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.422000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 15 00:22:13.424000 audit[2054]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2054 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.424000 audit[2054]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=fffff02cabb0 a2=0 a3=0 items=0 ppid=2001 pid=2054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.424000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 15 00:22:13.426000 audit[2056]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2056 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.426000 audit[2056]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdc5bbd30 a2=0 a3=0 items=0 ppid=2001 pid=2056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.426000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 15 00:22:13.429000 audit[2058]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2058 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.429000 audit[2058]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd24e7090 a2=0 a3=0 items=0 ppid=2001 pid=2058 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.429000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 15 00:22:13.431000 audit[2060]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2060 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.431000 audit[2060]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffff748dd00 a2=0 a3=0 items=0 ppid=2001 pid=2060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.431000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 15 00:22:13.433000 audit[2062]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2062 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.433000 audit[2062]: SYSCALL arch=c00000b7 
syscall=211 success=yes exit=112 a0=3 a1=ffffedfb0eb0 a2=0 a3=0 items=0 ppid=2001 pid=2062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.433000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 15 00:22:13.435000 audit[2064]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2064 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.435000 audit[2064]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=fffffb504370 a2=0 a3=0 items=0 ppid=2001 pid=2064 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.435000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 15 00:22:13.437000 audit[2066]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2066 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.437000 audit[2066]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffc15967a0 a2=0 a3=0 items=0 ppid=2001 pid=2066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.437000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 15 00:22:13.468000 audit[2069]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2069 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.468000 audit[2069]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=ffffd8324190 a2=0 a3=0 items=0 ppid=2001 pid=2069 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.468000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 15 00:22:13.470000 audit[2071]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2071 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.470000 audit[2071]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffff922f7c0 a2=0 a3=0 items=0 ppid=2001 pid=2071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.470000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 15 00:22:13.471000 audit[2073]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2073 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.471000 audit[2073]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 
a1=ffffcdc41e60 a2=0 a3=0 items=0 ppid=2001 pid=2073 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.471000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 15 00:22:13.473000 audit[2075]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2075 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.473000 audit[2075]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffde9df740 a2=0 a3=0 items=0 ppid=2001 pid=2075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.473000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 15 00:22:13.475000 audit[2077]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2077 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.475000 audit[2077]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=fffff2856670 a2=0 a3=0 items=0 ppid=2001 pid=2077 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.475000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 15 00:22:13.517000 audit[2107]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2107 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.517000 audit[2107]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffe419d4c0 a2=0 a3=0 items=0 ppid=2001 pid=2107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.517000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 15 00:22:13.520000 audit[2109]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2109 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.520000 audit[2109]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffc9690c10 a2=0 a3=0 items=0 ppid=2001 pid=2109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.520000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 15 00:22:13.522000 audit[2111]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2111 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.522000 audit[2111]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe1327300 a2=0 a3=0 items=0 ppid=2001 pid=2111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.522000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 15 00:22:13.523000 audit[2113]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2113 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.523000 audit[2113]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcab63370 a2=0 a3=0 items=0 ppid=2001 pid=2113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.523000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 15 00:22:13.525000 audit[2115]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2115 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.525000 audit[2115]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc70bee90 a2=0 a3=0 items=0 ppid=2001 pid=2115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.525000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 15 00:22:13.527000 audit[2117]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2117 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.527000 audit[2117]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffde815f70 a2=0 a3=0 items=0 ppid=2001 pid=2117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.527000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 15 00:22:13.529000 audit[2119]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2119 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.529000 audit[2119]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffde26bea0 a2=0 a3=0 items=0 ppid=2001 pid=2119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.529000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 15 00:22:13.533000 audit[2121]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2121 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.533000 audit[2121]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=fffff8585810 a2=0 a3=0 items=0 ppid=2001 pid=2121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.533000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 15 00:22:13.536000 audit[2123]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2123 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.536000 audit[2123]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffe4306ce0 a2=0 a3=0 items=0 ppid=2001 pid=2123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.536000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 15 00:22:13.537000 audit[2125]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2125 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.537000 audit[2125]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffe911f570 a2=0 a3=0 items=0 ppid=2001 pid=2125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.537000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 15 00:22:13.539000 audit[2127]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2127 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.539000 audit[2127]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffd6654370 a2=0 a3=0 items=0 ppid=2001 pid=2127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.539000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 15 00:22:13.541000 audit[2129]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2129 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.541000 audit[2129]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffca6e6210 a2=0 a3=0 items=0 ppid=2001 pid=2129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.541000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 15 00:22:13.543000 audit[2131]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2131 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.543000 audit[2131]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffc1f6a550 a2=0 a3=0 items=0 ppid=2001 pid=2131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.543000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 15 00:22:13.549000 audit[2136]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2136 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.549000 audit[2136]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc76c1020 a2=0 a3=0 items=0 ppid=2001 pid=2136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.549000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 15 00:22:13.552000 audit[2138]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2138 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.552000 audit[2138]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffd497d920 a2=0 a3=0 items=0 ppid=2001 pid=2138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.552000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 15 00:22:13.554000 audit[2140]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2140 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.554000 audit[2140]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffea3524d0 a2=0 a3=0 items=0 ppid=2001 pid=2140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.554000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 15 00:22:13.556000 audit[2142]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2142 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.556000 audit[2142]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd73d2010 a2=0 a3=0 items=0 ppid=2001 pid=2142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.556000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 15 00:22:13.559000 audit[2144]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2144 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.559000 audit[2144]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=fffff2aae750 a2=0 a3=0 items=0 ppid=2001 pid=2144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.559000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 15 00:22:13.561000 audit[2146]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2146 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:13.561000 audit[2146]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffe2fcb790 a2=0 a3=0 items=0 ppid=2001 pid=2146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.561000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 15 00:22:13.579000 audit[2151]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2151 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.579000 audit[2151]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=ffffc1e7d7b0 a2=0 a3=0 items=0 ppid=2001 pid=2151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.579000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 15 00:22:13.581000 audit[2153]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2153 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.581000 audit[2153]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffe06c94f0 a2=0 a3=0 items=0 ppid=2001 pid=2153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.581000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 15 00:22:13.590000 audit[2161]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2161 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.590000 audit[2161]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=ffffed6dfc90 a2=0 a3=0 items=0 ppid=2001 pid=2161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.590000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 15 00:22:13.600000 audit[2167]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2167 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.600000 audit[2167]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffd916f320 a2=0 a3=0 items=0 ppid=2001 pid=2167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.600000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 15 00:22:13.603000 audit[2169]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2169 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 
00:22:13.603000 audit[2169]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=ffffeb3d71f0 a2=0 a3=0 items=0 ppid=2001 pid=2169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.603000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 15 00:22:13.605000 audit[2171]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2171 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.605000 audit[2171]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffe32077b0 a2=0 a3=0 items=0 ppid=2001 pid=2171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.605000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 15 00:22:13.607000 audit[2173]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2173 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.607000 audit[2173]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=ffffefac1150 a2=0 a3=0 items=0 ppid=2001 pid=2173 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.607000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 15 00:22:13.609000 audit[2175]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2175 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:13.609000 audit[2175]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffe16af630 a2=0 a3=0 items=0 ppid=2001 pid=2175 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:13.609000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 15 00:22:13.611190 systemd-networkd[1612]: docker0: Link UP Jan 15 00:22:13.616064 dockerd[2001]: time="2026-01-15T00:22:13.615985547Z" level=info msg="Loading containers: done." Jan 15 00:22:13.629500 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck4034217237-merged.mount: Deactivated successfully. 
Jan 15 00:22:13.641961 dockerd[2001]: time="2026-01-15T00:22:13.641540591Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 15 00:22:13.641961 dockerd[2001]: time="2026-01-15T00:22:13.641629208Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 15 00:22:13.641961 dockerd[2001]: time="2026-01-15T00:22:13.641805186Z" level=info msg="Initializing buildkit" Jan 15 00:22:13.665600 dockerd[2001]: time="2026-01-15T00:22:13.665562152Z" level=info msg="Completed buildkit initialization" Jan 15 00:22:13.670766 dockerd[2001]: time="2026-01-15T00:22:13.670725811Z" level=info msg="Daemon has completed initialization" Jan 15 00:22:13.671165 dockerd[2001]: time="2026-01-15T00:22:13.670988455Z" level=info msg="API listen on /run/docker.sock" Jan 15 00:22:13.671185 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 15 00:22:13.670000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:14.819070 containerd[1703]: time="2026-01-15T00:22:14.819018456Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\"" Jan 15 00:22:15.579190 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4097555899.mount: Deactivated successfully. Jan 15 00:22:16.745846 containerd[1703]: time="2026-01-15T00:22:16.745777576Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:16.746775 containerd[1703]: time="2026-01-15T00:22:16.746717999Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.11: active requests=0, bytes read=24845792" Jan 15 00:22:16.748084 containerd[1703]: time="2026-01-15T00:22:16.748033560Z" level=info msg="ImageCreate event name:\"sha256:58951ea1a0b5de44646ea292c94b9350f33f22d147fccfd84bdc405eaabc442c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:16.751277 containerd[1703]: time="2026-01-15T00:22:16.751228190Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:16.752445 containerd[1703]: time="2026-01-15T00:22:16.752398366Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.11\" with image id \"sha256:58951ea1a0b5de44646ea292c94b9350f33f22d147fccfd84bdc405eaabc442c\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\", size \"26438581\" in 1.933324573s" Jan 15 00:22:16.752445 containerd[1703]: time="2026-01-15T00:22:16.752438190Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\" returns image reference \"sha256:58951ea1a0b5de44646ea292c94b9350f33f22d147fccfd84bdc405eaabc442c\"" Jan 15 00:22:16.753491 containerd[1703]: time="2026-01-15T00:22:16.753241982Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\"" Jan 15 00:22:18.332651 containerd[1703]: time="2026-01-15T00:22:18.332591030Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-controller-manager:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:18.334286 containerd[1703]: time="2026-01-15T00:22:18.334166158Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.11: active requests=0, bytes read=22613932" Jan 15 00:22:18.335540 containerd[1703]: time="2026-01-15T00:22:18.335495672Z" level=info msg="ImageCreate event name:\"sha256:82766e5f2d560b930b7069c03ec1366dc8fdb4a490c3005266d2fdc4ca21c2fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:18.338956 containerd[1703]: time="2026-01-15T00:22:18.338908616Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:18.340177 containerd[1703]: time="2026-01-15T00:22:18.339937829Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.11\" with image id \"sha256:82766e5f2d560b930b7069c03ec1366dc8fdb4a490c3005266d2fdc4ca21c2fc\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\", size \"24206567\" in 1.58666079s" Jan 15 00:22:18.340177 containerd[1703]: time="2026-01-15T00:22:18.339974446Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\" returns image reference \"sha256:82766e5f2d560b930b7069c03ec1366dc8fdb4a490c3005266d2fdc4ca21c2fc\"" Jan 15 00:22:18.340726 containerd[1703]: time="2026-01-15T00:22:18.340693218Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\"" Jan 15 00:22:19.890082 containerd[1703]: time="2026-01-15T00:22:19.889162157Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:19.890082 containerd[1703]: time="2026-01-15T00:22:19.890040719Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.11: active requests=0, bytes read=17608611" Jan 15 00:22:19.890718 containerd[1703]: time="2026-01-15T00:22:19.890694122Z" level=info msg="ImageCreate event name:\"sha256:cfa17ff3d66343f03eadbc235264b0615de49cc1f43da12cddba27d80c61f2c6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:19.895158 containerd[1703]: time="2026-01-15T00:22:19.895111936Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:19.895715 containerd[1703]: time="2026-01-15T00:22:19.895678337Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.11\" with image id \"sha256:cfa17ff3d66343f03eadbc235264b0615de49cc1f43da12cddba27d80c61f2c6\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\", size \"19201246\" in 1.554949489s" Jan 15 00:22:19.895715 containerd[1703]: time="2026-01-15T00:22:19.895710738Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\" returns image reference \"sha256:cfa17ff3d66343f03eadbc235264b0615de49cc1f43da12cddba27d80c61f2c6\"" Jan 15 00:22:19.896256 containerd[1703]: time="2026-01-15T00:22:19.896231059Z" level=info msg="PullImage 
\"registry.k8s.io/kube-proxy:v1.32.11\"" Jan 15 00:22:20.879709 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2638166224.mount: Deactivated successfully. Jan 15 00:22:21.110878 containerd[1703]: time="2026-01-15T00:22:21.110829661Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:21.112399 containerd[1703]: time="2026-01-15T00:22:21.112341463Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.11: active requests=0, bytes read=27555003" Jan 15 00:22:21.113522 containerd[1703]: time="2026-01-15T00:22:21.113478226Z" level=info msg="ImageCreate event name:\"sha256:dcdb790dc2bfe6e0b86f702c7f336a38eaef34f6370eb6ff68f4e5b03ed4d425\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:21.115849 containerd[1703]: time="2026-01-15T00:22:21.115809950Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:21.116857 containerd[1703]: time="2026-01-15T00:22:21.116827632Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.11\" with image id \"sha256:dcdb790dc2bfe6e0b86f702c7f336a38eaef34f6370eb6ff68f4e5b03ed4d425\", repo tag \"registry.k8s.io/kube-proxy:v1.32.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\", size \"27557743\" in 1.220563373s" Jan 15 00:22:21.116971 containerd[1703]: time="2026-01-15T00:22:21.116953912Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\" returns image reference \"sha256:dcdb790dc2bfe6e0b86f702c7f336a38eaef34f6370eb6ff68f4e5b03ed4d425\"" Jan 15 00:22:21.117468 containerd[1703]: time="2026-01-15T00:22:21.117444233Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jan 15 00:22:21.746928 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3518963220.mount: Deactivated successfully. Jan 15 00:22:22.269943 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 15 00:22:22.272287 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 00:22:22.420503 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 00:22:22.419000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:22.421716 kernel: kauditd_printk_skb: 134 callbacks suppressed Jan 15 00:22:22.421782 kernel: audit: type=1130 audit(1768436542.419:299): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:22:22.425887 (kubelet)[2352]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 00:22:22.446465 containerd[1703]: time="2026-01-15T00:22:22.446414933Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:22.449502 containerd[1703]: time="2026-01-15T00:22:22.448417059Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=15956282" Jan 15 00:22:22.451298 containerd[1703]: time="2026-01-15T00:22:22.451264588Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:22.455102 containerd[1703]: time="2026-01-15T00:22:22.455071320Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:22.456150 containerd[1703]: time="2026-01-15T00:22:22.456124723Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.33864885s" Jan 15 00:22:22.456219 containerd[1703]: time="2026-01-15T00:22:22.456156003Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Jan 15 00:22:22.456687 containerd[1703]: time="2026-01-15T00:22:22.456649965Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 15 00:22:22.466790 kubelet[2352]: E0115 00:22:22.466746 2352 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 00:22:22.469318 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 00:22:22.469554 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 00:22:22.468000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 15 00:22:22.469990 systemd[1]: kubelet.service: Consumed 152ms CPU time, 107.8M memory peak. Jan 15 00:22:22.473210 kernel: audit: type=1131 audit(1768436542.468:300): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 15 00:22:22.941643 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3810632210.mount: Deactivated successfully. 
Jan 15 00:22:22.948229 containerd[1703]: time="2026-01-15T00:22:22.948150348Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 15 00:22:22.949147 containerd[1703]: time="2026-01-15T00:22:22.949034671Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 15 00:22:22.950357 containerd[1703]: time="2026-01-15T00:22:22.950290035Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 15 00:22:22.952971 containerd[1703]: time="2026-01-15T00:22:22.952916443Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 15 00:22:22.953877 containerd[1703]: time="2026-01-15T00:22:22.953835165Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 497.07592ms" Jan 15 00:22:22.953877 containerd[1703]: time="2026-01-15T00:22:22.953867646Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Jan 15 00:22:22.954465 containerd[1703]: time="2026-01-15T00:22:22.954438887Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jan 15 00:22:23.516310 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2683635964.mount: Deactivated successfully. 
Jan 15 00:22:25.610847 containerd[1703]: time="2026-01-15T00:22:25.610799852Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:25.611880 containerd[1703]: time="2026-01-15T00:22:25.611597935Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67146608" Jan 15 00:22:25.612598 containerd[1703]: time="2026-01-15T00:22:25.612568538Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:25.615918 containerd[1703]: time="2026-01-15T00:22:25.615883868Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:25.617534 containerd[1703]: time="2026-01-15T00:22:25.617491633Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 2.663018146s" Jan 15 00:22:25.617972 containerd[1703]: time="2026-01-15T00:22:25.617929154Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" Jan 15 00:22:32.361844 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 00:22:32.361000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:32.362126 systemd[1]: kubelet.service: Consumed 152ms CPU time, 107.8M memory peak. Jan 15 00:22:32.364055 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 00:22:32.361000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:32.368446 kernel: audit: type=1130 audit(1768436552.361:301): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:32.368542 kernel: audit: type=1131 audit(1768436552.361:302): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:32.392156 systemd[1]: Reload requested from client PID 2450 ('systemctl') (unit session-9.scope)... Jan 15 00:22:32.392213 systemd[1]: Reloading... Jan 15 00:22:32.479608 zram_generator::config[2499]: No configuration found. Jan 15 00:22:32.651379 systemd[1]: Reloading finished in 258 ms. 
Jan 15 00:22:32.676000 audit: BPF prog-id=63 op=LOAD Jan 15 00:22:32.676000 audit: BPF prog-id=52 op=UNLOAD Jan 15 00:22:32.680277 kernel: audit: type=1334 audit(1768436552.676:303): prog-id=63 op=LOAD Jan 15 00:22:32.680320 kernel: audit: type=1334 audit(1768436552.676:304): prog-id=52 op=UNLOAD Jan 15 00:22:32.680342 kernel: audit: type=1334 audit(1768436552.677:305): prog-id=64 op=LOAD Jan 15 00:22:32.677000 audit: BPF prog-id=64 op=LOAD Jan 15 00:22:32.677000 audit: BPF prog-id=65 op=LOAD Jan 15 00:22:32.677000 audit: BPF prog-id=53 op=UNLOAD Jan 15 00:22:32.683156 kernel: audit: type=1334 audit(1768436552.677:306): prog-id=65 op=LOAD Jan 15 00:22:32.683213 kernel: audit: type=1334 audit(1768436552.677:307): prog-id=53 op=UNLOAD Jan 15 00:22:32.683243 kernel: audit: type=1334 audit(1768436552.677:308): prog-id=54 op=UNLOAD Jan 15 00:22:32.677000 audit: BPF prog-id=54 op=UNLOAD Jan 15 00:22:32.684156 kernel: audit: type=1334 audit(1768436552.680:309): prog-id=66 op=LOAD Jan 15 00:22:32.680000 audit: BPF prog-id=66 op=LOAD Jan 15 00:22:32.680000 audit: BPF prog-id=58 op=UNLOAD Jan 15 00:22:32.681000 audit: BPF prog-id=67 op=LOAD Jan 15 00:22:32.681000 audit: BPF prog-id=59 op=UNLOAD Jan 15 00:22:32.682000 audit: BPF prog-id=68 op=LOAD Jan 15 00:22:32.686198 kernel: audit: type=1334 audit(1768436552.680:310): prog-id=58 op=UNLOAD Jan 15 00:22:32.688000 audit: BPF prog-id=49 op=UNLOAD Jan 15 00:22:32.688000 audit: BPF prog-id=69 op=LOAD Jan 15 00:22:32.688000 audit: BPF prog-id=70 op=LOAD Jan 15 00:22:32.688000 audit: BPF prog-id=50 op=UNLOAD Jan 15 00:22:32.688000 audit: BPF prog-id=51 op=UNLOAD Jan 15 00:22:32.688000 audit: BPF prog-id=71 op=LOAD Jan 15 00:22:32.688000 audit: BPF prog-id=55 op=UNLOAD Jan 15 00:22:32.689000 audit: BPF prog-id=72 op=LOAD Jan 15 00:22:32.689000 audit: BPF prog-id=73 op=LOAD Jan 15 00:22:32.689000 audit: BPF prog-id=56 op=UNLOAD Jan 15 00:22:32.689000 audit: BPF prog-id=57 op=UNLOAD Jan 15 00:22:32.689000 audit: BPF prog-id=74 op=LOAD Jan 15 00:22:32.689000 audit: BPF prog-id=75 op=LOAD Jan 15 00:22:32.689000 audit: BPF prog-id=43 op=UNLOAD Jan 15 00:22:32.689000 audit: BPF prog-id=44 op=UNLOAD Jan 15 00:22:32.690000 audit: BPF prog-id=76 op=LOAD Jan 15 00:22:32.690000 audit: BPF prog-id=48 op=UNLOAD Jan 15 00:22:32.691000 audit: BPF prog-id=77 op=LOAD Jan 15 00:22:32.691000 audit: BPF prog-id=45 op=UNLOAD Jan 15 00:22:32.691000 audit: BPF prog-id=78 op=LOAD Jan 15 00:22:32.691000 audit: BPF prog-id=79 op=LOAD Jan 15 00:22:32.691000 audit: BPF prog-id=46 op=UNLOAD Jan 15 00:22:32.691000 audit: BPF prog-id=47 op=UNLOAD Jan 15 00:22:32.692000 audit: BPF prog-id=80 op=LOAD Jan 15 00:22:32.692000 audit: BPF prog-id=60 op=UNLOAD Jan 15 00:22:32.692000 audit: BPF prog-id=81 op=LOAD Jan 15 00:22:32.692000 audit: BPF prog-id=82 op=LOAD Jan 15 00:22:32.692000 audit: BPF prog-id=61 op=UNLOAD Jan 15 00:22:32.692000 audit: BPF prog-id=62 op=UNLOAD Jan 15 00:22:32.706913 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 15 00:22:32.706993 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 15 00:22:32.707326 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 00:22:32.707000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 15 00:22:32.707385 systemd[1]: kubelet.service: Consumed 96ms CPU time, 95.1M memory peak. 
Jan 15 00:22:32.710462 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 00:22:32.848000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:32.848802 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 00:22:32.853845 (kubelet)[2544]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 15 00:22:32.892284 kubelet[2544]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 15 00:22:32.892284 kubelet[2544]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 15 00:22:32.892284 kubelet[2544]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 15 00:22:32.892614 kubelet[2544]: I0115 00:22:32.892332 2544 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 15 00:22:33.699284 kubelet[2544]: I0115 00:22:33.699241 2544 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 15 00:22:33.699284 kubelet[2544]: I0115 00:22:33.699277 2544 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 15 00:22:33.699592 kubelet[2544]: I0115 00:22:33.699556 2544 server.go:954] "Client rotation is on, will bootstrap in background" Jan 15 00:22:33.731861 kubelet[2544]: E0115 00:22:33.731819 2544 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.3.29:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.3.29:6443: connect: connection refused" logger="UnhandledError" Jan 15 00:22:33.737747 kubelet[2544]: I0115 00:22:33.737713 2544 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 15 00:22:33.745971 kubelet[2544]: I0115 00:22:33.745927 2544 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 15 00:22:33.749217 kubelet[2544]: I0115 00:22:33.748954 2544 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 15 00:22:33.749318 kubelet[2544]: I0115 00:22:33.749239 2544 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 15 00:22:33.749446 kubelet[2544]: I0115 00:22:33.749268 2544 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4515-1-0-n-1ddc109f0f","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 15 00:22:33.749543 kubelet[2544]: I0115 00:22:33.749537 2544 topology_manager.go:138] "Creating topology manager with none policy" Jan 15 00:22:33.749566 kubelet[2544]: I0115 00:22:33.749546 2544 container_manager_linux.go:304] "Creating device plugin manager" Jan 15 00:22:33.749781 kubelet[2544]: I0115 00:22:33.749764 2544 state_mem.go:36] "Initialized new in-memory state store" Jan 15 00:22:33.753593 kubelet[2544]: I0115 00:22:33.753565 2544 kubelet.go:446] "Attempting to sync node with API server" Jan 15 00:22:33.753593 kubelet[2544]: I0115 00:22:33.753593 2544 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 15 00:22:33.753655 kubelet[2544]: I0115 00:22:33.753618 2544 kubelet.go:352] "Adding apiserver pod source" Jan 15 00:22:33.753655 kubelet[2544]: I0115 00:22:33.753628 2544 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 15 00:22:33.756323 kubelet[2544]: W0115 00:22:33.756203 2544 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.3.29:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-n-1ddc109f0f&limit=500&resourceVersion=0": dial tcp 10.0.3.29:6443: connect: connection refused Jan 15 00:22:33.756323 kubelet[2544]: E0115 00:22:33.756274 2544 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.3.29:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-n-1ddc109f0f&limit=500&resourceVersion=0\": dial tcp 10.0.3.29:6443: connect: connection refused" logger="UnhandledError" Jan 15 00:22:33.757928 kubelet[2544]: 
I0115 00:22:33.757373 2544 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 15 00:22:33.757928 kubelet[2544]: W0115 00:22:33.757526 2544 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.3.29:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.3.29:6443: connect: connection refused Jan 15 00:22:33.757928 kubelet[2544]: E0115 00:22:33.757570 2544 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.3.29:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.3.29:6443: connect: connection refused" logger="UnhandledError" Jan 15 00:22:33.758115 kubelet[2544]: I0115 00:22:33.758088 2544 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 15 00:22:33.758291 kubelet[2544]: W0115 00:22:33.758276 2544 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 15 00:22:33.760191 kubelet[2544]: I0115 00:22:33.759608 2544 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 15 00:22:33.760191 kubelet[2544]: I0115 00:22:33.759654 2544 server.go:1287] "Started kubelet" Jan 15 00:22:33.760191 kubelet[2544]: I0115 00:22:33.759723 2544 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 15 00:22:33.761016 kubelet[2544]: I0115 00:22:33.760972 2544 server.go:479] "Adding debug handlers to kubelet server" Jan 15 00:22:33.765153 kubelet[2544]: I0115 00:22:33.765121 2544 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 15 00:22:33.766743 kubelet[2544]: I0115 00:22:33.766617 2544 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 15 00:22:33.766886 kubelet[2544]: I0115 00:22:33.766861 2544 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 15 00:22:33.766990 kubelet[2544]: E0115 00:22:33.766693 2544 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.3.29:6443/api/v1/namespaces/default/events\": dial tcp 10.0.3.29:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4515-1-0-n-1ddc109f0f.188abfad12a88e33 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4515-1-0-n-1ddc109f0f,UID:ci-4515-1-0-n-1ddc109f0f,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4515-1-0-n-1ddc109f0f,},FirstTimestamp:2026-01-15 00:22:33.759624755 +0000 UTC m=+0.902658965,LastTimestamp:2026-01-15 00:22:33.759624755 +0000 UTC m=+0.902658965,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4515-1-0-n-1ddc109f0f,}" Jan 15 00:22:33.767077 kubelet[2544]: E0115 00:22:33.767051 2544 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-1ddc109f0f\" not found" Jan 15 00:22:33.767115 kubelet[2544]: I0115 00:22:33.767088 2544 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 15 00:22:33.767347 kubelet[2544]: I0115 00:22:33.767305 2544 desired_state_of_world_populator.go:150] "Desired state 
populator starts to run" Jan 15 00:22:33.767401 kubelet[2544]: I0115 00:22:33.767386 2544 reconciler.go:26] "Reconciler: start to sync state" Jan 15 00:22:33.768431 kubelet[2544]: E0115 00:22:33.768397 2544 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.3.29:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-n-1ddc109f0f?timeout=10s\": dial tcp 10.0.3.29:6443: connect: connection refused" interval="200ms" Jan 15 00:22:33.768000 audit[2557]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2557 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:33.768000 audit[2557]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffc93c0650 a2=0 a3=0 items=0 ppid=2544 pid=2557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:33.768000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 15 00:22:33.769545 kubelet[2544]: W0115 00:22:33.768401 2544 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.3.29:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.3.29:6443: connect: connection refused Jan 15 00:22:33.769545 kubelet[2544]: E0115 00:22:33.769307 2544 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.3.29:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.3.29:6443: connect: connection refused" logger="UnhandledError" Jan 15 00:22:33.769545 kubelet[2544]: I0115 00:22:33.768556 2544 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 15 00:22:33.769000 audit[2558]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2558 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:33.769000 audit[2558]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcd6d99b0 a2=0 a3=0 items=0 ppid=2544 pid=2558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:33.769000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 15 00:22:33.770085 kubelet[2544]: I0115 00:22:33.769996 2544 factory.go:221] Registration of the systemd container factory successfully Jan 15 00:22:33.770162 kubelet[2544]: I0115 00:22:33.770127 2544 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 15 00:22:33.771578 kubelet[2544]: E0115 00:22:33.771555 2544 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 15 00:22:33.772087 kubelet[2544]: I0115 00:22:33.772063 2544 factory.go:221] Registration of the containerd container factory successfully Jan 15 00:22:33.772000 audit[2560]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2560 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:33.772000 audit[2560]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffd5e75bc0 a2=0 a3=0 items=0 ppid=2544 pid=2560 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:33.772000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 15 00:22:33.774000 audit[2562]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2562 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:33.774000 audit[2562]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffe3de2b40 a2=0 a3=0 items=0 ppid=2544 pid=2562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:33.774000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 15 00:22:33.786891 kubelet[2544]: I0115 00:22:33.786863 2544 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 15 00:22:33.786000 audit[2567]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2567 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:33.786000 audit[2567]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=fffff1286c60 a2=0 a3=0 items=0 ppid=2544 pid=2567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:33.786000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 15 00:22:33.787252 kubelet[2544]: I0115 00:22:33.787197 2544 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 15 00:22:33.787252 kubelet[2544]: I0115 00:22:33.787220 2544 state_mem.go:36] "Initialized new in-memory state store" Jan 15 00:22:33.787333 kubelet[2544]: I0115 00:22:33.787109 2544 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Jan 15 00:22:33.787000 audit[2568]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2568 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:33.787000 audit[2568]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=fffffba4c790 a2=0 a3=0 items=0 ppid=2544 pid=2568 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:33.787000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 15 00:22:33.788000 audit[2569]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2569 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:33.788000 audit[2569]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe7fe81e0 a2=0 a3=0 items=0 ppid=2544 pid=2569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:33.788000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 15 00:22:33.789321 kubelet[2544]: I0115 00:22:33.788399 2544 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 15 00:22:33.789321 kubelet[2544]: I0115 00:22:33.788431 2544 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 15 00:22:33.789321 kubelet[2544]: I0115 00:22:33.788450 2544 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jan 15 00:22:33.789321 kubelet[2544]: I0115 00:22:33.788456 2544 kubelet.go:2382] "Starting kubelet main sync loop" Jan 15 00:22:33.789321 kubelet[2544]: E0115 00:22:33.788497 2544 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 15 00:22:33.789321 kubelet[2544]: W0115 00:22:33.788955 2544 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.3.29:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.3.29:6443: connect: connection refused Jan 15 00:22:33.789321 kubelet[2544]: E0115 00:22:33.788999 2544 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.3.29:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.3.29:6443: connect: connection refused" logger="UnhandledError" Jan 15 00:22:33.789000 audit[2571]: NETFILTER_CFG table=mangle:49 family=10 entries=1 op=nft_register_chain pid=2571 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:33.789000 audit[2571]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff39eddd0 a2=0 a3=0 items=0 ppid=2544 pid=2571 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:33.789000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 15 00:22:33.789000 audit[2572]: NETFILTER_CFG table=nat:50 family=2 entries=1 op=nft_register_chain pid=2572 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:33.789000 audit[2572]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff0da6a20 a2=0 a3=0 items=0 ppid=2544 pid=2572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:33.789000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 15 00:22:33.790000 audit[2573]: NETFILTER_CFG table=nat:51 family=10 entries=1 op=nft_register_chain pid=2573 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:33.790000 audit[2573]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd52421e0 a2=0 a3=0 items=0 ppid=2544 pid=2573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:33.790000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 15 00:22:33.790000 audit[2574]: NETFILTER_CFG table=filter:52 family=2 entries=1 op=nft_register_chain pid=2574 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:33.790000 audit[2574]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd17e42d0 a2=0 a3=0 items=0 ppid=2544 pid=2574 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 15 00:22:33.790000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 15 00:22:33.791000 audit[2575]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2575 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:33.791000 audit[2575]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc0fb0050 a2=0 a3=0 items=0 ppid=2544 pid=2575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:33.791000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 15 00:22:33.793723 kubelet[2544]: I0115 00:22:33.793463 2544 policy_none.go:49] "None policy: Start" Jan 15 00:22:33.793723 kubelet[2544]: I0115 00:22:33.793486 2544 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 15 00:22:33.793723 kubelet[2544]: I0115 00:22:33.793499 2544 state_mem.go:35] "Initializing new in-memory state store" Jan 15 00:22:33.799305 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 15 00:22:33.811793 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 15 00:22:33.827994 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 15 00:22:33.829836 kubelet[2544]: I0115 00:22:33.829809 2544 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 15 00:22:33.830021 kubelet[2544]: I0115 00:22:33.830008 2544 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 15 00:22:33.830051 kubelet[2544]: I0115 00:22:33.830024 2544 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 15 00:22:33.830747 kubelet[2544]: I0115 00:22:33.830556 2544 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 15 00:22:33.831966 kubelet[2544]: E0115 00:22:33.831944 2544 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 15 00:22:33.832065 kubelet[2544]: E0115 00:22:33.832054 2544 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4515-1-0-n-1ddc109f0f\" not found" Jan 15 00:22:33.897362 systemd[1]: Created slice kubepods-burstable-pod26fb68362dc2a643d15360cc1b2791a3.slice - libcontainer container kubepods-burstable-pod26fb68362dc2a643d15360cc1b2791a3.slice. Jan 15 00:22:33.908089 kubelet[2544]: E0115 00:22:33.908061 2544 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-1ddc109f0f\" not found" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:33.910822 systemd[1]: Created slice kubepods-burstable-pod8bc3aec537de54b0a44b57386bb39227.slice - libcontainer container kubepods-burstable-pod8bc3aec537de54b0a44b57386bb39227.slice. 
Jan 15 00:22:33.912502 kubelet[2544]: E0115 00:22:33.912476 2544 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-1ddc109f0f\" not found" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:33.914872 systemd[1]: Created slice kubepods-burstable-pod921845c7fba3dc0759018a9a18178d42.slice - libcontainer container kubepods-burstable-pod921845c7fba3dc0759018a9a18178d42.slice. Jan 15 00:22:33.916484 kubelet[2544]: E0115 00:22:33.916453 2544 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-1ddc109f0f\" not found" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:33.931938 kubelet[2544]: I0115 00:22:33.931916 2544 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:33.932384 kubelet[2544]: E0115 00:22:33.932359 2544 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.3.29:6443/api/v1/nodes\": dial tcp 10.0.3.29:6443: connect: connection refused" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:33.970149 kubelet[2544]: E0115 00:22:33.970044 2544 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.3.29:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-n-1ddc109f0f?timeout=10s\": dial tcp 10.0.3.29:6443: connect: connection refused" interval="400ms" Jan 15 00:22:34.068451 kubelet[2544]: I0115 00:22:34.068410 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/921845c7fba3dc0759018a9a18178d42-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4515-1-0-n-1ddc109f0f\" (UID: \"921845c7fba3dc0759018a9a18178d42\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:34.068451 kubelet[2544]: I0115 00:22:34.068457 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/26fb68362dc2a643d15360cc1b2791a3-kubeconfig\") pod \"kube-scheduler-ci-4515-1-0-n-1ddc109f0f\" (UID: \"26fb68362dc2a643d15360cc1b2791a3\") " pod="kube-system/kube-scheduler-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:34.068716 kubelet[2544]: I0115 00:22:34.068475 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8bc3aec537de54b0a44b57386bb39227-ca-certs\") pod \"kube-apiserver-ci-4515-1-0-n-1ddc109f0f\" (UID: \"8bc3aec537de54b0a44b57386bb39227\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:34.068716 kubelet[2544]: I0115 00:22:34.068490 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8bc3aec537de54b0a44b57386bb39227-k8s-certs\") pod \"kube-apiserver-ci-4515-1-0-n-1ddc109f0f\" (UID: \"8bc3aec537de54b0a44b57386bb39227\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:34.068716 kubelet[2544]: I0115 00:22:34.068508 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/921845c7fba3dc0759018a9a18178d42-ca-certs\") pod \"kube-controller-manager-ci-4515-1-0-n-1ddc109f0f\" (UID: \"921845c7fba3dc0759018a9a18178d42\") " 
pod="kube-system/kube-controller-manager-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:34.068716 kubelet[2544]: I0115 00:22:34.068522 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/921845c7fba3dc0759018a9a18178d42-k8s-certs\") pod \"kube-controller-manager-ci-4515-1-0-n-1ddc109f0f\" (UID: \"921845c7fba3dc0759018a9a18178d42\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:34.068716 kubelet[2544]: I0115 00:22:34.068536 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/921845c7fba3dc0759018a9a18178d42-kubeconfig\") pod \"kube-controller-manager-ci-4515-1-0-n-1ddc109f0f\" (UID: \"921845c7fba3dc0759018a9a18178d42\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:34.068893 kubelet[2544]: I0115 00:22:34.068553 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8bc3aec537de54b0a44b57386bb39227-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4515-1-0-n-1ddc109f0f\" (UID: \"8bc3aec537de54b0a44b57386bb39227\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:34.068893 kubelet[2544]: I0115 00:22:34.068582 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/921845c7fba3dc0759018a9a18178d42-flexvolume-dir\") pod \"kube-controller-manager-ci-4515-1-0-n-1ddc109f0f\" (UID: \"921845c7fba3dc0759018a9a18178d42\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:34.134881 kubelet[2544]: I0115 00:22:34.134859 2544 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:34.135209 kubelet[2544]: E0115 00:22:34.135186 2544 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.3.29:6443/api/v1/nodes\": dial tcp 10.0.3.29:6443: connect: connection refused" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:34.209817 containerd[1703]: time="2026-01-15T00:22:34.209634933Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4515-1-0-n-1ddc109f0f,Uid:26fb68362dc2a643d15360cc1b2791a3,Namespace:kube-system,Attempt:0,}" Jan 15 00:22:34.213277 containerd[1703]: time="2026-01-15T00:22:34.213095904Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4515-1-0-n-1ddc109f0f,Uid:8bc3aec537de54b0a44b57386bb39227,Namespace:kube-system,Attempt:0,}" Jan 15 00:22:34.217977 containerd[1703]: time="2026-01-15T00:22:34.217942839Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4515-1-0-n-1ddc109f0f,Uid:921845c7fba3dc0759018a9a18178d42,Namespace:kube-system,Attempt:0,}" Jan 15 00:22:34.237051 containerd[1703]: time="2026-01-15T00:22:34.236923057Z" level=info msg="connecting to shim 0c4540d0d1351112a68d38f219412562c5fedeaa7d0a6c731ee0cea6fe24e24b" address="unix:///run/containerd/s/446433a4535420d1034752d1284badbc2a2f0604c439763c17cae9ac6aa8e33c" namespace=k8s.io protocol=ttrpc version=3 Jan 15 00:22:34.255587 containerd[1703]: time="2026-01-15T00:22:34.255467554Z" level=info msg="connecting to shim 97b986f076ca85bd12656b191071cfe79b9479c2527fbb64cef116a95567d845" 
address="unix:///run/containerd/s/617abd4beee0bc770bcec76525259867c4899c110bc2fdbff0f1664b27dec940" namespace=k8s.io protocol=ttrpc version=3 Jan 15 00:22:34.259860 containerd[1703]: time="2026-01-15T00:22:34.259819767Z" level=info msg="connecting to shim 08183a919e7ba618cae7f89a12b45b32c2bb86918246d01e17c63a93db3c9876" address="unix:///run/containerd/s/f69039bc65a3950919086b0f712abb117d343e353eded86d44169ead0c9e2bf2" namespace=k8s.io protocol=ttrpc version=3 Jan 15 00:22:34.274411 systemd[1]: Started cri-containerd-0c4540d0d1351112a68d38f219412562c5fedeaa7d0a6c731ee0cea6fe24e24b.scope - libcontainer container 0c4540d0d1351112a68d38f219412562c5fedeaa7d0a6c731ee0cea6fe24e24b. Jan 15 00:22:34.289509 systemd[1]: Started cri-containerd-08183a919e7ba618cae7f89a12b45b32c2bb86918246d01e17c63a93db3c9876.scope - libcontainer container 08183a919e7ba618cae7f89a12b45b32c2bb86918246d01e17c63a93db3c9876. Jan 15 00:22:34.290591 systemd[1]: Started cri-containerd-97b986f076ca85bd12656b191071cfe79b9479c2527fbb64cef116a95567d845.scope - libcontainer container 97b986f076ca85bd12656b191071cfe79b9479c2527fbb64cef116a95567d845. Jan 15 00:22:34.292000 audit: BPF prog-id=83 op=LOAD Jan 15 00:22:34.293000 audit: BPF prog-id=84 op=LOAD Jan 15 00:22:34.293000 audit[2603]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40000fe180 a2=98 a3=0 items=0 ppid=2585 pid=2603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.293000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063343534306430643133353131313261363864333866323139343132 Jan 15 00:22:34.293000 audit: BPF prog-id=84 op=UNLOAD Jan 15 00:22:34.293000 audit[2603]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2585 pid=2603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.293000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063343534306430643133353131313261363864333866323139343132 Jan 15 00:22:34.293000 audit: BPF prog-id=85 op=LOAD Jan 15 00:22:34.293000 audit[2603]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40000fe3e8 a2=98 a3=0 items=0 ppid=2585 pid=2603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.293000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063343534306430643133353131313261363864333866323139343132 Jan 15 00:22:34.293000 audit: BPF prog-id=86 op=LOAD Jan 15 00:22:34.293000 audit[2603]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40000fe168 a2=98 a3=0 items=0 ppid=2585 pid=2603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.293000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063343534306430643133353131313261363864333866323139343132 Jan 15 00:22:34.293000 audit: BPF prog-id=86 op=UNLOAD Jan 15 00:22:34.293000 audit[2603]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2585 pid=2603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.293000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063343534306430643133353131313261363864333866323139343132 Jan 15 00:22:34.293000 audit: BPF prog-id=85 op=UNLOAD Jan 15 00:22:34.293000 audit[2603]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2585 pid=2603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.293000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063343534306430643133353131313261363864333866323139343132 Jan 15 00:22:34.293000 audit: BPF prog-id=87 op=LOAD Jan 15 00:22:34.293000 audit[2603]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40000fe648 a2=98 a3=0 items=0 ppid=2585 pid=2603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.293000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3063343534306430643133353131313261363864333866323139343132 Jan 15 00:22:34.301000 audit: BPF prog-id=88 op=LOAD Jan 15 00:22:34.302000 audit: BPF prog-id=89 op=LOAD Jan 15 00:22:34.302000 audit[2646]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=2605 pid=2646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.302000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937623938366630373663613835626431323635366231393130373163 Jan 15 00:22:34.302000 audit: BPF prog-id=89 op=UNLOAD Jan 15 00:22:34.302000 audit[2646]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2605 pid=2646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.302000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937623938366630373663613835626431323635366231393130373163 Jan 15 00:22:34.302000 audit: BPF prog-id=90 op=LOAD Jan 15 00:22:34.302000 audit[2646]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=2605 pid=2646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.302000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937623938366630373663613835626431323635366231393130373163 Jan 15 00:22:34.302000 audit: BPF prog-id=91 op=LOAD Jan 15 00:22:34.302000 audit[2646]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=2605 pid=2646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.302000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937623938366630373663613835626431323635366231393130373163 Jan 15 00:22:34.302000 audit: BPF prog-id=91 op=UNLOAD Jan 15 00:22:34.302000 audit[2646]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2605 pid=2646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.302000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937623938366630373663613835626431323635366231393130373163 Jan 15 00:22:34.302000 audit: BPF prog-id=90 op=UNLOAD Jan 15 00:22:34.302000 audit[2646]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2605 pid=2646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.302000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937623938366630373663613835626431323635366231393130373163 Jan 15 00:22:34.302000 audit: BPF prog-id=92 op=LOAD Jan 15 00:22:34.302000 audit[2646]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=2605 pid=2646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.302000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937623938366630373663613835626431323635366231393130373163 Jan 15 00:22:34.303000 audit: BPF prog-id=93 op=LOAD Jan 15 00:22:34.304000 audit: BPF prog-id=94 op=LOAD Jan 15 00:22:34.304000 audit[2650]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2628 pid=2650 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038313833613931396537626136313863616537663839613132623435 Jan 15 00:22:34.304000 audit: BPF prog-id=94 op=UNLOAD Jan 15 00:22:34.304000 audit[2650]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2628 pid=2650 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038313833613931396537626136313863616537663839613132623435 Jan 15 00:22:34.304000 audit: BPF prog-id=95 op=LOAD Jan 15 00:22:34.304000 audit[2650]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2628 pid=2650 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038313833613931396537626136313863616537663839613132623435 Jan 15 00:22:34.304000 audit: BPF prog-id=96 op=LOAD Jan 15 00:22:34.304000 audit[2650]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2628 pid=2650 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038313833613931396537626136313863616537663839613132623435 Jan 15 00:22:34.304000 audit: BPF prog-id=96 op=UNLOAD Jan 15 00:22:34.304000 audit[2650]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2628 pid=2650 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.304000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038313833613931396537626136313863616537663839613132623435 Jan 15 00:22:34.304000 audit: BPF prog-id=95 op=UNLOAD Jan 15 00:22:34.304000 audit[2650]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2628 pid=2650 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038313833613931396537626136313863616537663839613132623435 Jan 15 00:22:34.304000 audit: BPF prog-id=97 op=LOAD Jan 15 00:22:34.304000 audit[2650]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2628 pid=2650 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038313833613931396537626136313863616537663839613132623435 Jan 15 00:22:34.330264 update_engine[1682]: I20260115 00:22:34.330209 1682 update_attempter.cc:509] Updating boot flags... Jan 15 00:22:34.333959 containerd[1703]: time="2026-01-15T00:22:34.333906514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4515-1-0-n-1ddc109f0f,Uid:26fb68362dc2a643d15360cc1b2791a3,Namespace:kube-system,Attempt:0,} returns sandbox id \"0c4540d0d1351112a68d38f219412562c5fedeaa7d0a6c731ee0cea6fe24e24b\"" Jan 15 00:22:34.335942 containerd[1703]: time="2026-01-15T00:22:34.335892720Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4515-1-0-n-1ddc109f0f,Uid:8bc3aec537de54b0a44b57386bb39227,Namespace:kube-system,Attempt:0,} returns sandbox id \"97b986f076ca85bd12656b191071cfe79b9479c2527fbb64cef116a95567d845\"" Jan 15 00:22:34.337834 containerd[1703]: time="2026-01-15T00:22:34.337784406Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4515-1-0-n-1ddc109f0f,Uid:921845c7fba3dc0759018a9a18178d42,Namespace:kube-system,Attempt:0,} returns sandbox id \"08183a919e7ba618cae7f89a12b45b32c2bb86918246d01e17c63a93db3c9876\"" Jan 15 00:22:34.339587 containerd[1703]: time="2026-01-15T00:22:34.339565411Z" level=info msg="CreateContainer within sandbox \"0c4540d0d1351112a68d38f219412562c5fedeaa7d0a6c731ee0cea6fe24e24b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 15 00:22:34.340110 containerd[1703]: time="2026-01-15T00:22:34.340076733Z" level=info msg="CreateContainer within sandbox \"97b986f076ca85bd12656b191071cfe79b9479c2527fbb64cef116a95567d845\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 15 00:22:34.348526 containerd[1703]: time="2026-01-15T00:22:34.348477918Z" level=info msg="Container 66af349207e414b22989e05e0739947ad935fd5a4bbb0d2a5acff2e011cd964a: CDI devices from CRI Config.CDIDevices: []" Jan 15 00:22:34.363105 containerd[1703]: 
time="2026-01-15T00:22:34.362679482Z" level=info msg="CreateContainer within sandbox \"08183a919e7ba618cae7f89a12b45b32c2bb86918246d01e17c63a93db3c9876\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 15 00:22:34.370694 kubelet[2544]: E0115 00:22:34.370644 2544 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.3.29:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-n-1ddc109f0f?timeout=10s\": dial tcp 10.0.3.29:6443: connect: connection refused" interval="800ms" Jan 15 00:22:34.375836 containerd[1703]: time="2026-01-15T00:22:34.375786962Z" level=info msg="CreateContainer within sandbox \"0c4540d0d1351112a68d38f219412562c5fedeaa7d0a6c731ee0cea6fe24e24b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"66af349207e414b22989e05e0739947ad935fd5a4bbb0d2a5acff2e011cd964a\"" Jan 15 00:22:34.384250 containerd[1703]: time="2026-01-15T00:22:34.381556100Z" level=info msg="StartContainer for \"66af349207e414b22989e05e0739947ad935fd5a4bbb0d2a5acff2e011cd964a\"" Jan 15 00:22:34.384250 containerd[1703]: time="2026-01-15T00:22:34.382632583Z" level=info msg="connecting to shim 66af349207e414b22989e05e0739947ad935fd5a4bbb0d2a5acff2e011cd964a" address="unix:///run/containerd/s/446433a4535420d1034752d1284badbc2a2f0604c439763c17cae9ac6aa8e33c" protocol=ttrpc version=3 Jan 15 00:22:34.387789 containerd[1703]: time="2026-01-15T00:22:34.387687638Z" level=info msg="Container 265bcc13ef0763b693906fa95a92ae3d97b907a0d0e9bef37d63480e77f6db77: CDI devices from CRI Config.CDIDevices: []" Jan 15 00:22:34.419791 containerd[1703]: time="2026-01-15T00:22:34.415800645Z" level=info msg="CreateContainer within sandbox \"97b986f076ca85bd12656b191071cfe79b9479c2527fbb64cef116a95567d845\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"265bcc13ef0763b693906fa95a92ae3d97b907a0d0e9bef37d63480e77f6db77\"" Jan 15 00:22:34.420086 containerd[1703]: time="2026-01-15T00:22:34.420018417Z" level=info msg="Container 652a3b325d5dfe852f4a64d18402e1991306d05b0ed0ee5ba8cbf183c48ea90e: CDI devices from CRI Config.CDIDevices: []" Jan 15 00:22:34.420903 containerd[1703]: time="2026-01-15T00:22:34.420035938Z" level=info msg="StartContainer for \"265bcc13ef0763b693906fa95a92ae3d97b907a0d0e9bef37d63480e77f6db77\"" Jan 15 00:22:34.421939 containerd[1703]: time="2026-01-15T00:22:34.421910103Z" level=info msg="connecting to shim 265bcc13ef0763b693906fa95a92ae3d97b907a0d0e9bef37d63480e77f6db77" address="unix:///run/containerd/s/617abd4beee0bc770bcec76525259867c4899c110bc2fdbff0f1664b27dec940" protocol=ttrpc version=3 Jan 15 00:22:34.435794 containerd[1703]: time="2026-01-15T00:22:34.434319821Z" level=info msg="CreateContainer within sandbox \"08183a919e7ba618cae7f89a12b45b32c2bb86918246d01e17c63a93db3c9876\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"652a3b325d5dfe852f4a64d18402e1991306d05b0ed0ee5ba8cbf183c48ea90e\"" Jan 15 00:22:34.438565 containerd[1703]: time="2026-01-15T00:22:34.438528914Z" level=info msg="StartContainer for \"652a3b325d5dfe852f4a64d18402e1991306d05b0ed0ee5ba8cbf183c48ea90e\"" Jan 15 00:22:34.439619 containerd[1703]: time="2026-01-15T00:22:34.439587997Z" level=info msg="connecting to shim 652a3b325d5dfe852f4a64d18402e1991306d05b0ed0ee5ba8cbf183c48ea90e" address="unix:///run/containerd/s/f69039bc65a3950919086b0f712abb117d343e353eded86d44169ead0c9e2bf2" protocol=ttrpc version=3 Jan 15 00:22:34.478655 systemd[1]: Started 
cri-containerd-265bcc13ef0763b693906fa95a92ae3d97b907a0d0e9bef37d63480e77f6db77.scope - libcontainer container 265bcc13ef0763b693906fa95a92ae3d97b907a0d0e9bef37d63480e77f6db77. Jan 15 00:22:34.479886 systemd[1]: Started cri-containerd-652a3b325d5dfe852f4a64d18402e1991306d05b0ed0ee5ba8cbf183c48ea90e.scope - libcontainer container 652a3b325d5dfe852f4a64d18402e1991306d05b0ed0ee5ba8cbf183c48ea90e. Jan 15 00:22:34.480817 systemd[1]: Started cri-containerd-66af349207e414b22989e05e0739947ad935fd5a4bbb0d2a5acff2e011cd964a.scope - libcontainer container 66af349207e414b22989e05e0739947ad935fd5a4bbb0d2a5acff2e011cd964a. Jan 15 00:22:34.491000 audit: BPF prog-id=98 op=LOAD Jan 15 00:22:34.493000 audit: BPF prog-id=99 op=LOAD Jan 15 00:22:34.493000 audit[2740]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=2605 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.493000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236356263633133656630373633623639333930366661393561393261 Jan 15 00:22:34.494000 audit: BPF prog-id=99 op=UNLOAD Jan 15 00:22:34.494000 audit[2740]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2605 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.494000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236356263633133656630373633623639333930366661393561393261 Jan 15 00:22:34.494000 audit: BPF prog-id=100 op=LOAD Jan 15 00:22:34.494000 audit[2740]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=2605 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.494000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236356263633133656630373633623639333930366661393561393261 Jan 15 00:22:34.494000 audit: BPF prog-id=101 op=LOAD Jan 15 00:22:34.494000 audit[2740]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=2605 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.494000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236356263633133656630373633623639333930366661393561393261 Jan 15 00:22:34.494000 audit: BPF prog-id=101 op=UNLOAD Jan 15 00:22:34.494000 audit[2740]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 
a3=0 items=0 ppid=2605 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.494000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236356263633133656630373633623639333930366661393561393261 Jan 15 00:22:34.494000 audit: BPF prog-id=100 op=UNLOAD Jan 15 00:22:34.494000 audit[2740]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2605 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.494000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236356263633133656630373633623639333930366661393561393261 Jan 15 00:22:34.494000 audit: BPF prog-id=102 op=LOAD Jan 15 00:22:34.494000 audit[2740]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=2605 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.494000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236356263633133656630373633623639333930366661393561393261 Jan 15 00:22:34.496000 audit: BPF prog-id=103 op=LOAD Jan 15 00:22:34.497000 audit: BPF prog-id=104 op=LOAD Jan 15 00:22:34.497000 audit[2753]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128180 a2=98 a3=0 items=0 ppid=2628 pid=2753 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3635326133623332356435646665383532663461363464313834303265 Jan 15 00:22:34.497000 audit: BPF prog-id=104 op=UNLOAD Jan 15 00:22:34.497000 audit[2753]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2628 pid=2753 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3635326133623332356435646665383532663461363464313834303265 Jan 15 00:22:34.497000 audit: BPF prog-id=105 op=LOAD Jan 15 00:22:34.497000 audit[2753]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001283e8 a2=98 a3=0 items=0 ppid=2628 pid=2753 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3635326133623332356435646665383532663461363464313834303265 Jan 15 00:22:34.497000 audit: BPF prog-id=106 op=LOAD Jan 15 00:22:34.497000 audit[2753]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000128168 a2=98 a3=0 items=0 ppid=2628 pid=2753 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3635326133623332356435646665383532663461363464313834303265 Jan 15 00:22:34.497000 audit: BPF prog-id=106 op=UNLOAD Jan 15 00:22:34.497000 audit[2753]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2628 pid=2753 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3635326133623332356435646665383532663461363464313834303265 Jan 15 00:22:34.497000 audit: BPF prog-id=105 op=UNLOAD Jan 15 00:22:34.497000 audit[2753]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2628 pid=2753 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3635326133623332356435646665383532663461363464313834303265 Jan 15 00:22:34.497000 audit: BPF prog-id=107 op=LOAD Jan 15 00:22:34.497000 audit[2753]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128648 a2=98 a3=0 items=0 ppid=2628 pid=2753 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.497000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3635326133623332356435646665383532663461363464313834303265 Jan 15 00:22:34.498000 audit: BPF prog-id=108 op=LOAD Jan 15 00:22:34.499000 audit: BPF prog-id=109 op=LOAD Jan 15 00:22:34.499000 audit[2732]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=2585 pid=2732 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.499000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636616633343932303765343134623232393839653035653037333939 Jan 15 00:22:34.499000 audit: BPF prog-id=109 op=UNLOAD Jan 15 00:22:34.499000 audit[2732]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2585 pid=2732 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.499000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636616633343932303765343134623232393839653035653037333939 Jan 15 00:22:34.499000 audit: BPF prog-id=110 op=LOAD Jan 15 00:22:34.499000 audit[2732]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=2585 pid=2732 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.499000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636616633343932303765343134623232393839653035653037333939 Jan 15 00:22:34.499000 audit: BPF prog-id=111 op=LOAD Jan 15 00:22:34.499000 audit[2732]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=2585 pid=2732 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.499000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636616633343932303765343134623232393839653035653037333939 Jan 15 00:22:34.499000 audit: BPF prog-id=111 op=UNLOAD Jan 15 00:22:34.499000 audit[2732]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2585 pid=2732 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.499000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636616633343932303765343134623232393839653035653037333939 Jan 15 00:22:34.499000 audit: BPF prog-id=110 op=UNLOAD Jan 15 00:22:34.499000 audit[2732]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2585 pid=2732 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.499000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636616633343932303765343134623232393839653035653037333939 Jan 15 00:22:34.499000 audit: BPF prog-id=112 op=LOAD Jan 15 00:22:34.499000 audit[2732]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=2585 pid=2732 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:34.499000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636616633343932303765343134623232393839653035653037333939 Jan 15 00:22:34.535052 containerd[1703]: time="2026-01-15T00:22:34.534393528Z" level=info msg="StartContainer for \"265bcc13ef0763b693906fa95a92ae3d97b907a0d0e9bef37d63480e77f6db77\" returns successfully" Jan 15 00:22:34.535052 containerd[1703]: time="2026-01-15T00:22:34.534532608Z" level=info msg="StartContainer for \"652a3b325d5dfe852f4a64d18402e1991306d05b0ed0ee5ba8cbf183c48ea90e\" returns successfully" Jan 15 00:22:34.536013 containerd[1703]: time="2026-01-15T00:22:34.535982373Z" level=info msg="StartContainer for \"66af349207e414b22989e05e0739947ad935fd5a4bbb0d2a5acff2e011cd964a\" returns successfully" Jan 15 00:22:34.537485 kubelet[2544]: I0115 00:22:34.537405 2544 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:34.537905 kubelet[2544]: E0115 00:22:34.537728 2544 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.3.29:6443/api/v1/nodes\": dial tcp 10.0.3.29:6443: connect: connection refused" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:34.797622 kubelet[2544]: E0115 00:22:34.797041 2544 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-1ddc109f0f\" not found" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:34.799141 kubelet[2544]: E0115 00:22:34.799112 2544 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-1ddc109f0f\" not found" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:34.801908 kubelet[2544]: E0115 00:22:34.801888 2544 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-1ddc109f0f\" not found" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:35.340666 kubelet[2544]: I0115 00:22:35.340603 2544 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:35.804533 kubelet[2544]: E0115 00:22:35.804503 2544 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-1ddc109f0f\" not found" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:35.804719 kubelet[2544]: E0115 00:22:35.804574 2544 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-1ddc109f0f\" not found" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:36.257291 kubelet[2544]: E0115 00:22:36.257166 2544 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes 
\"ci-4515-1-0-n-1ddc109f0f\" not found" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:36.342264 kubelet[2544]: I0115 00:22:36.342220 2544 kubelet_node_status.go:78] "Successfully registered node" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:36.367639 kubelet[2544]: I0115 00:22:36.367582 2544 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:36.378208 kubelet[2544]: E0115 00:22:36.378095 2544 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4515-1-0-n-1ddc109f0f.188abfad12a88e33 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4515-1-0-n-1ddc109f0f,UID:ci-4515-1-0-n-1ddc109f0f,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4515-1-0-n-1ddc109f0f,},FirstTimestamp:2026-01-15 00:22:33.759624755 +0000 UTC m=+0.902658965,LastTimestamp:2026-01-15 00:22:33.759624755 +0000 UTC m=+0.902658965,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4515-1-0-n-1ddc109f0f,}" Jan 15 00:22:36.425648 kubelet[2544]: E0115 00:22:36.425607 2544 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4515-1-0-n-1ddc109f0f\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:36.425648 kubelet[2544]: I0115 00:22:36.425643 2544 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:36.427770 kubelet[2544]: E0115 00:22:36.427742 2544 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4515-1-0-n-1ddc109f0f\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:36.427825 kubelet[2544]: I0115 00:22:36.427773 2544 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:36.429476 kubelet[2544]: E0115 00:22:36.429451 2544 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4515-1-0-n-1ddc109f0f\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:36.758096 kubelet[2544]: I0115 00:22:36.758035 2544 apiserver.go:52] "Watching apiserver" Jan 15 00:22:36.768059 kubelet[2544]: I0115 00:22:36.768021 2544 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 15 00:22:37.255655 kubelet[2544]: I0115 00:22:37.255611 2544 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:38.304584 systemd[1]: Reload requested from client PID 2840 ('systemctl') (unit session-9.scope)... Jan 15 00:22:38.304603 systemd[1]: Reloading... Jan 15 00:22:38.357494 kubelet[2544]: I0115 00:22:38.357446 2544 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:38.380268 zram_generator::config[2886]: No configuration found. Jan 15 00:22:38.567732 systemd[1]: Reloading finished in 262 ms. 
Jan 15 00:22:38.598310 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 00:22:38.613576 systemd[1]: kubelet.service: Deactivated successfully. Jan 15 00:22:38.613864 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 00:22:38.613000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:38.613937 systemd[1]: kubelet.service: Consumed 1.314s CPU time, 131M memory peak. Jan 15 00:22:38.614803 kernel: kauditd_printk_skb: 202 callbacks suppressed Jan 15 00:22:38.614865 kernel: audit: type=1131 audit(1768436558.613:405): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:38.615920 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 00:22:38.617000 audit: BPF prog-id=113 op=LOAD Jan 15 00:22:38.618723 kernel: audit: type=1334 audit(1768436558.617:406): prog-id=113 op=LOAD Jan 15 00:22:38.618778 kernel: audit: type=1334 audit(1768436558.617:407): prog-id=63 op=UNLOAD Jan 15 00:22:38.617000 audit: BPF prog-id=63 op=UNLOAD Jan 15 00:22:38.618000 audit: BPF prog-id=114 op=LOAD Jan 15 00:22:38.620658 kernel: audit: type=1334 audit(1768436558.618:408): prog-id=114 op=LOAD Jan 15 00:22:38.619000 audit: BPF prog-id=115 op=LOAD Jan 15 00:22:38.621516 kernel: audit: type=1334 audit(1768436558.619:409): prog-id=115 op=LOAD Jan 15 00:22:38.621589 kernel: audit: type=1334 audit(1768436558.619:410): prog-id=64 op=UNLOAD Jan 15 00:22:38.619000 audit: BPF prog-id=64 op=UNLOAD Jan 15 00:22:38.619000 audit: BPF prog-id=65 op=UNLOAD Jan 15 00:22:38.620000 audit: BPF prog-id=116 op=LOAD Jan 15 00:22:38.623995 kernel: audit: type=1334 audit(1768436558.619:411): prog-id=65 op=UNLOAD Jan 15 00:22:38.624033 kernel: audit: type=1334 audit(1768436558.620:412): prog-id=116 op=LOAD Jan 15 00:22:38.631000 audit: BPF prog-id=68 op=UNLOAD Jan 15 00:22:38.632000 audit: BPF prog-id=117 op=LOAD Jan 15 00:22:38.633503 kernel: audit: type=1334 audit(1768436558.631:413): prog-id=68 op=UNLOAD Jan 15 00:22:38.633543 kernel: audit: type=1334 audit(1768436558.632:414): prog-id=117 op=LOAD Jan 15 00:22:38.633000 audit: BPF prog-id=118 op=LOAD Jan 15 00:22:38.633000 audit: BPF prog-id=69 op=UNLOAD Jan 15 00:22:38.633000 audit: BPF prog-id=70 op=UNLOAD Jan 15 00:22:38.633000 audit: BPF prog-id=119 op=LOAD Jan 15 00:22:38.633000 audit: BPF prog-id=71 op=UNLOAD Jan 15 00:22:38.633000 audit: BPF prog-id=120 op=LOAD Jan 15 00:22:38.633000 audit: BPF prog-id=121 op=LOAD Jan 15 00:22:38.633000 audit: BPF prog-id=72 op=UNLOAD Jan 15 00:22:38.633000 audit: BPF prog-id=73 op=UNLOAD Jan 15 00:22:38.635000 audit: BPF prog-id=122 op=LOAD Jan 15 00:22:38.635000 audit: BPF prog-id=67 op=UNLOAD Jan 15 00:22:38.636000 audit: BPF prog-id=123 op=LOAD Jan 15 00:22:38.636000 audit: BPF prog-id=77 op=UNLOAD Jan 15 00:22:38.636000 audit: BPF prog-id=124 op=LOAD Jan 15 00:22:38.636000 audit: BPF prog-id=125 op=LOAD Jan 15 00:22:38.636000 audit: BPF prog-id=78 op=UNLOAD Jan 15 00:22:38.636000 audit: BPF prog-id=79 op=UNLOAD Jan 15 00:22:38.636000 audit: BPF prog-id=126 op=LOAD Jan 15 00:22:38.636000 audit: BPF prog-id=66 op=UNLOAD Jan 15 00:22:38.637000 audit: BPF prog-id=127 op=LOAD Jan 15 00:22:38.637000 audit: BPF prog-id=128 op=LOAD Jan 15 00:22:38.637000 
audit: BPF prog-id=74 op=UNLOAD Jan 15 00:22:38.637000 audit: BPF prog-id=75 op=UNLOAD Jan 15 00:22:38.638000 audit: BPF prog-id=129 op=LOAD Jan 15 00:22:38.638000 audit: BPF prog-id=80 op=UNLOAD Jan 15 00:22:38.638000 audit: BPF prog-id=130 op=LOAD Jan 15 00:22:38.638000 audit: BPF prog-id=131 op=LOAD Jan 15 00:22:38.638000 audit: BPF prog-id=81 op=UNLOAD Jan 15 00:22:38.638000 audit: BPF prog-id=82 op=UNLOAD Jan 15 00:22:38.639000 audit: BPF prog-id=132 op=LOAD Jan 15 00:22:38.639000 audit: BPF prog-id=76 op=UNLOAD Jan 15 00:22:38.766947 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 00:22:38.766000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:38.778556 (kubelet)[2931]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 15 00:22:38.818593 kubelet[2931]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 15 00:22:38.818593 kubelet[2931]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 15 00:22:38.818593 kubelet[2931]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 15 00:22:38.818924 kubelet[2931]: I0115 00:22:38.818577 2931 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 15 00:22:38.824730 kubelet[2931]: I0115 00:22:38.824694 2931 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 15 00:22:38.824730 kubelet[2931]: I0115 00:22:38.824722 2931 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 15 00:22:38.825296 kubelet[2931]: I0115 00:22:38.825252 2931 server.go:954] "Client rotation is on, will bootstrap in background" Jan 15 00:22:38.828728 kubelet[2931]: I0115 00:22:38.828695 2931 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 15 00:22:38.831181 kubelet[2931]: I0115 00:22:38.831148 2931 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 15 00:22:38.835100 kubelet[2931]: I0115 00:22:38.835077 2931 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 15 00:22:38.837570 kubelet[2931]: I0115 00:22:38.837542 2931 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 15 00:22:38.837767 kubelet[2931]: I0115 00:22:38.837740 2931 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 15 00:22:38.837934 kubelet[2931]: I0115 00:22:38.837767 2931 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4515-1-0-n-1ddc109f0f","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 15 00:22:38.838010 kubelet[2931]: I0115 00:22:38.837941 2931 topology_manager.go:138] "Creating topology manager with none policy" Jan 15 00:22:38.838010 kubelet[2931]: I0115 00:22:38.837950 2931 container_manager_linux.go:304] "Creating device plugin manager" Jan 15 00:22:38.838010 kubelet[2931]: I0115 00:22:38.837991 2931 state_mem.go:36] "Initialized new in-memory state store" Jan 15 00:22:38.838153 kubelet[2931]: I0115 00:22:38.838141 2931 kubelet.go:446] "Attempting to sync node with API server" Jan 15 00:22:38.838190 kubelet[2931]: I0115 00:22:38.838157 2931 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 15 00:22:38.838217 kubelet[2931]: I0115 00:22:38.838195 2931 kubelet.go:352] "Adding apiserver pod source" Jan 15 00:22:38.838217 kubelet[2931]: I0115 00:22:38.838212 2931 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 15 00:22:38.838959 kubelet[2931]: I0115 00:22:38.838840 2931 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 15 00:22:38.839669 kubelet[2931]: I0115 00:22:38.839594 2931 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 15 00:22:38.840661 kubelet[2931]: I0115 00:22:38.840610 2931 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 15 00:22:38.840661 kubelet[2931]: I0115 00:22:38.840667 2931 server.go:1287] "Started kubelet" Jan 15 00:22:38.842127 kubelet[2931]: I0115 00:22:38.841469 2931 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 15 00:22:38.842127 kubelet[2931]: I0115 
00:22:38.841819 2931 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 15 00:22:38.842266 kubelet[2931]: I0115 00:22:38.841878 2931 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 15 00:22:38.843073 kubelet[2931]: I0115 00:22:38.843030 2931 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 15 00:22:38.843920 kubelet[2931]: I0115 00:22:38.843896 2931 server.go:479] "Adding debug handlers to kubelet server" Jan 15 00:22:38.847532 kubelet[2931]: I0115 00:22:38.847510 2931 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 15 00:22:38.850729 kubelet[2931]: E0115 00:22:38.850690 2931 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-1ddc109f0f\" not found" Jan 15 00:22:38.855181 kubelet[2931]: I0115 00:22:38.852384 2931 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 15 00:22:38.855181 kubelet[2931]: I0115 00:22:38.852510 2931 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 15 00:22:38.855181 kubelet[2931]: I0115 00:22:38.852667 2931 reconciler.go:26] "Reconciler: start to sync state" Jan 15 00:22:38.863621 kubelet[2931]: I0115 00:22:38.863293 2931 factory.go:221] Registration of the systemd container factory successfully Jan 15 00:22:38.863621 kubelet[2931]: I0115 00:22:38.863432 2931 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 15 00:22:38.865013 kubelet[2931]: I0115 00:22:38.864976 2931 factory.go:221] Registration of the containerd container factory successfully Jan 15 00:22:38.868784 kubelet[2931]: I0115 00:22:38.868713 2931 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 15 00:22:38.870880 kubelet[2931]: I0115 00:22:38.870823 2931 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 15 00:22:38.871028 kubelet[2931]: I0115 00:22:38.871014 2931 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 15 00:22:38.871102 kubelet[2931]: I0115 00:22:38.871090 2931 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jan 15 00:22:38.871151 kubelet[2931]: I0115 00:22:38.871143 2931 kubelet.go:2382] "Starting kubelet main sync loop" Jan 15 00:22:38.871266 kubelet[2931]: E0115 00:22:38.871247 2931 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 15 00:22:38.900761 kubelet[2931]: I0115 00:22:38.900730 2931 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 15 00:22:38.900761 kubelet[2931]: I0115 00:22:38.900755 2931 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 15 00:22:38.900875 kubelet[2931]: I0115 00:22:38.900791 2931 state_mem.go:36] "Initialized new in-memory state store" Jan 15 00:22:38.900969 kubelet[2931]: I0115 00:22:38.900950 2931 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 15 00:22:38.900994 kubelet[2931]: I0115 00:22:38.900967 2931 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 15 00:22:38.900994 kubelet[2931]: I0115 00:22:38.900985 2931 policy_none.go:49] "None policy: Start" Jan 15 00:22:38.900994 kubelet[2931]: I0115 00:22:38.900994 2931 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 15 00:22:38.901057 kubelet[2931]: I0115 00:22:38.901003 2931 state_mem.go:35] "Initializing new in-memory state store" Jan 15 00:22:38.901108 kubelet[2931]: I0115 00:22:38.901096 2931 state_mem.go:75] "Updated machine memory state" Jan 15 00:22:38.904931 kubelet[2931]: I0115 00:22:38.904594 2931 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 15 00:22:38.905130 kubelet[2931]: I0115 00:22:38.905114 2931 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 15 00:22:38.905271 kubelet[2931]: I0115 00:22:38.905229 2931 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 15 00:22:38.905648 kubelet[2931]: I0115 00:22:38.905627 2931 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 15 00:22:38.906934 kubelet[2931]: E0115 00:22:38.906903 2931 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 15 00:22:38.972605 kubelet[2931]: I0115 00:22:38.972563 2931 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:38.972853 kubelet[2931]: I0115 00:22:38.972604 2931 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:38.972997 kubelet[2931]: I0115 00:22:38.972691 2931 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:38.981918 kubelet[2931]: E0115 00:22:38.981887 2931 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4515-1-0-n-1ddc109f0f\" already exists" pod="kube-system/kube-scheduler-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:38.983124 kubelet[2931]: E0115 00:22:38.983098 2931 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4515-1-0-n-1ddc109f0f\" already exists" pod="kube-system/kube-apiserver-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.008464 kubelet[2931]: I0115 00:22:39.008444 2931 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.016808 kubelet[2931]: I0115 00:22:39.016679 2931 kubelet_node_status.go:124] "Node was previously registered" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.016808 kubelet[2931]: I0115 00:22:39.016758 2931 kubelet_node_status.go:78] "Successfully registered node" node="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.054581 kubelet[2931]: I0115 00:22:39.054517 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/8bc3aec537de54b0a44b57386bb39227-k8s-certs\") pod \"kube-apiserver-ci-4515-1-0-n-1ddc109f0f\" (UID: \"8bc3aec537de54b0a44b57386bb39227\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.054728 kubelet[2931]: I0115 00:22:39.054613 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/921845c7fba3dc0759018a9a18178d42-flexvolume-dir\") pod \"kube-controller-manager-ci-4515-1-0-n-1ddc109f0f\" (UID: \"921845c7fba3dc0759018a9a18178d42\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.054728 kubelet[2931]: I0115 00:22:39.054693 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/921845c7fba3dc0759018a9a18178d42-k8s-certs\") pod \"kube-controller-manager-ci-4515-1-0-n-1ddc109f0f\" (UID: \"921845c7fba3dc0759018a9a18178d42\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.054728 kubelet[2931]: I0115 00:22:39.054748 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/921845c7fba3dc0759018a9a18178d42-kubeconfig\") pod \"kube-controller-manager-ci-4515-1-0-n-1ddc109f0f\" (UID: \"921845c7fba3dc0759018a9a18178d42\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.054950 kubelet[2931]: I0115 00:22:39.054797 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/8bc3aec537de54b0a44b57386bb39227-ca-certs\") pod 
\"kube-apiserver-ci-4515-1-0-n-1ddc109f0f\" (UID: \"8bc3aec537de54b0a44b57386bb39227\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.054950 kubelet[2931]: I0115 00:22:39.054848 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/8bc3aec537de54b0a44b57386bb39227-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4515-1-0-n-1ddc109f0f\" (UID: \"8bc3aec537de54b0a44b57386bb39227\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.054950 kubelet[2931]: I0115 00:22:39.054898 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/921845c7fba3dc0759018a9a18178d42-ca-certs\") pod \"kube-controller-manager-ci-4515-1-0-n-1ddc109f0f\" (UID: \"921845c7fba3dc0759018a9a18178d42\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.055052 kubelet[2931]: I0115 00:22:39.054950 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/921845c7fba3dc0759018a9a18178d42-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4515-1-0-n-1ddc109f0f\" (UID: \"921845c7fba3dc0759018a9a18178d42\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.055052 kubelet[2931]: I0115 00:22:39.055000 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/26fb68362dc2a643d15360cc1b2791a3-kubeconfig\") pod \"kube-scheduler-ci-4515-1-0-n-1ddc109f0f\" (UID: \"26fb68362dc2a643d15360cc1b2791a3\") " pod="kube-system/kube-scheduler-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.839383 kubelet[2931]: I0115 00:22:39.839250 2931 apiserver.go:52] "Watching apiserver" Jan 15 00:22:39.853625 kubelet[2931]: I0115 00:22:39.853544 2931 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 15 00:22:39.886595 kubelet[2931]: I0115 00:22:39.886471 2931 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.886595 kubelet[2931]: I0115 00:22:39.886556 2931 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.892427 kubelet[2931]: E0115 00:22:39.892384 2931 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4515-1-0-n-1ddc109f0f\" already exists" pod="kube-system/kube-scheduler-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.894312 kubelet[2931]: E0115 00:22:39.894279 2931 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4515-1-0-n-1ddc109f0f\" already exists" pod="kube-system/kube-apiserver-ci-4515-1-0-n-1ddc109f0f" Jan 15 00:22:39.907651 kubelet[2931]: I0115 00:22:39.907586 2931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4515-1-0-n-1ddc109f0f" podStartSLOduration=1.9075516970000002 podStartE2EDuration="1.907551697s" podCreationTimestamp="2026-01-15 00:22:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 00:22:39.907255257 +0000 UTC m=+1.124838604" watchObservedRunningTime="2026-01-15 00:22:39.907551697 +0000 
UTC m=+1.125135044" Jan 15 00:22:39.924722 kubelet[2931]: I0115 00:22:39.924637 2931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4515-1-0-n-1ddc109f0f" podStartSLOduration=1.92460971 podStartE2EDuration="1.92460971s" podCreationTimestamp="2026-01-15 00:22:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 00:22:39.915832403 +0000 UTC m=+1.133415750" watchObservedRunningTime="2026-01-15 00:22:39.92460971 +0000 UTC m=+1.142193057" Jan 15 00:22:39.924920 kubelet[2931]: I0115 00:22:39.924759 2931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4515-1-0-n-1ddc109f0f" podStartSLOduration=2.92475427 podStartE2EDuration="2.92475427s" podCreationTimestamp="2026-01-15 00:22:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 00:22:39.924472429 +0000 UTC m=+1.142055776" watchObservedRunningTime="2026-01-15 00:22:39.92475427 +0000 UTC m=+1.142337657" Jan 15 00:22:44.993023 kubelet[2931]: I0115 00:22:44.992965 2931 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 15 00:22:44.993370 containerd[1703]: time="2026-01-15T00:22:44.993283663Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 15 00:22:44.993555 kubelet[2931]: I0115 00:22:44.993443 2931 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 15 00:22:45.758846 systemd[1]: Created slice kubepods-besteffort-pod36f2baab_83c8_472b_b9a9_93ee8d546bae.slice - libcontainer container kubepods-besteffort-pod36f2baab_83c8_472b_b9a9_93ee8d546bae.slice. 
Jan 15 00:22:45.798805 kubelet[2931]: I0115 00:22:45.798765 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/36f2baab-83c8-472b-b9a9-93ee8d546bae-kube-proxy\") pod \"kube-proxy-5p8xs\" (UID: \"36f2baab-83c8-472b-b9a9-93ee8d546bae\") " pod="kube-system/kube-proxy-5p8xs" Jan 15 00:22:45.798805 kubelet[2931]: I0115 00:22:45.798808 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/36f2baab-83c8-472b-b9a9-93ee8d546bae-xtables-lock\") pod \"kube-proxy-5p8xs\" (UID: \"36f2baab-83c8-472b-b9a9-93ee8d546bae\") " pod="kube-system/kube-proxy-5p8xs" Jan 15 00:22:45.798971 kubelet[2931]: I0115 00:22:45.798824 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz2xp\" (UniqueName: \"kubernetes.io/projected/36f2baab-83c8-472b-b9a9-93ee8d546bae-kube-api-access-nz2xp\") pod \"kube-proxy-5p8xs\" (UID: \"36f2baab-83c8-472b-b9a9-93ee8d546bae\") " pod="kube-system/kube-proxy-5p8xs" Jan 15 00:22:45.798971 kubelet[2931]: I0115 00:22:45.798846 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/36f2baab-83c8-472b-b9a9-93ee8d546bae-lib-modules\") pod \"kube-proxy-5p8xs\" (UID: \"36f2baab-83c8-472b-b9a9-93ee8d546bae\") " pod="kube-system/kube-proxy-5p8xs" Jan 15 00:22:46.079197 containerd[1703]: time="2026-01-15T00:22:46.079136784Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5p8xs,Uid:36f2baab-83c8-472b-b9a9-93ee8d546bae,Namespace:kube-system,Attempt:0,}" Jan 15 00:22:46.105267 containerd[1703]: time="2026-01-15T00:22:46.104494142Z" level=info msg="connecting to shim 8fd85773f4a3a1bdb5c282f19c1ff8010859b09733c7170db160cce2e7d4dadd" address="unix:///run/containerd/s/a805d5e3d65b86184e608849439d1b9f78144f8abb77e47a3e9761dc5bb47743" namespace=k8s.io protocol=ttrpc version=3 Jan 15 00:22:46.135056 systemd[1]: Created slice kubepods-besteffort-podabfc9c6e_3f66_4f4e_8ad6_c72f5664bf0b.slice - libcontainer container kubepods-besteffort-podabfc9c6e_3f66_4f4e_8ad6_c72f5664bf0b.slice. Jan 15 00:22:46.154625 systemd[1]: Started cri-containerd-8fd85773f4a3a1bdb5c282f19c1ff8010859b09733c7170db160cce2e7d4dadd.scope - libcontainer container 8fd85773f4a3a1bdb5c282f19c1ff8010859b09733c7170db160cce2e7d4dadd. 
Jan 15 00:22:46.163000 audit: BPF prog-id=133 op=LOAD Jan 15 00:22:46.165103 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 15 00:22:46.165167 kernel: audit: type=1334 audit(1768436566.163:447): prog-id=133 op=LOAD Jan 15 00:22:46.165199 kernel: audit: type=1334 audit(1768436566.163:448): prog-id=134 op=LOAD Jan 15 00:22:46.163000 audit: BPF prog-id=134 op=LOAD Jan 15 00:22:46.163000 audit[2999]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=2988 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.169395 kernel: audit: type=1300 audit(1768436566.163:448): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=2988 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.163000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866643835373733663461336131626462356332383266313963316666 Jan 15 00:22:46.172761 kernel: audit: type=1327 audit(1768436566.163:448): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866643835373733663461336131626462356332383266313963316666 Jan 15 00:22:46.164000 audit: BPF prog-id=134 op=UNLOAD Jan 15 00:22:46.173722 kernel: audit: type=1334 audit(1768436566.164:449): prog-id=134 op=UNLOAD Jan 15 00:22:46.173759 kernel: audit: type=1300 audit(1768436566.164:449): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2988 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.164000 audit[2999]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2988 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.164000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866643835373733663461336131626462356332383266313963316666 Jan 15 00:22:46.181466 kernel: audit: type=1327 audit(1768436566.164:449): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866643835373733663461336131626462356332383266313963316666 Jan 15 00:22:46.181745 kernel: audit: type=1334 audit(1768436566.165:450): prog-id=135 op=LOAD Jan 15 00:22:46.165000 audit: BPF prog-id=135 op=LOAD Jan 15 00:22:46.165000 audit[2999]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=2988 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.185864 kernel: audit: type=1300 audit(1768436566.165:450): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=2988 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.165000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866643835373733663461336131626462356332383266313963316666 Jan 15 00:22:46.189765 kernel: audit: type=1327 audit(1768436566.165:450): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866643835373733663461336131626462356332383266313963316666 Jan 15 00:22:46.165000 audit: BPF prog-id=136 op=LOAD Jan 15 00:22:46.165000 audit[2999]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=2988 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.165000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866643835373733663461336131626462356332383266313963316666 Jan 15 00:22:46.172000 audit: BPF prog-id=136 op=UNLOAD Jan 15 00:22:46.172000 audit[2999]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2988 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.172000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866643835373733663461336131626462356332383266313963316666 Jan 15 00:22:46.172000 audit: BPF prog-id=135 op=UNLOAD Jan 15 00:22:46.172000 audit[2999]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2988 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.172000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866643835373733663461336131626462356332383266313963316666 Jan 15 00:22:46.172000 audit: BPF prog-id=137 op=LOAD Jan 15 00:22:46.172000 audit[2999]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=2988 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.172000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866643835373733663461336131626462356332383266313963316666 Jan 15 00:22:46.201034 kubelet[2931]: I0115 00:22:46.200992 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g6gz\" (UniqueName: \"kubernetes.io/projected/abfc9c6e-3f66-4f4e-8ad6-c72f5664bf0b-kube-api-access-4g6gz\") pod \"tigera-operator-7dcd859c48-dtjxv\" (UID: \"abfc9c6e-3f66-4f4e-8ad6-c72f5664bf0b\") " pod="tigera-operator/tigera-operator-7dcd859c48-dtjxv" Jan 15 00:22:46.201034 kubelet[2931]: I0115 00:22:46.201034 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/abfc9c6e-3f66-4f4e-8ad6-c72f5664bf0b-var-lib-calico\") pod \"tigera-operator-7dcd859c48-dtjxv\" (UID: \"abfc9c6e-3f66-4f4e-8ad6-c72f5664bf0b\") " pod="tigera-operator/tigera-operator-7dcd859c48-dtjxv" Jan 15 00:22:46.201460 containerd[1703]: time="2026-01-15T00:22:46.201418399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5p8xs,Uid:36f2baab-83c8-472b-b9a9-93ee8d546bae,Namespace:kube-system,Attempt:0,} returns sandbox id \"8fd85773f4a3a1bdb5c282f19c1ff8010859b09733c7170db160cce2e7d4dadd\"" Jan 15 00:22:46.204141 containerd[1703]: time="2026-01-15T00:22:46.204106407Z" level=info msg="CreateContainer within sandbox \"8fd85773f4a3a1bdb5c282f19c1ff8010859b09733c7170db160cce2e7d4dadd\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 15 00:22:46.213377 containerd[1703]: time="2026-01-15T00:22:46.213342475Z" level=info msg="Container b956648cffd3739ea0ffaceed79148c37448843faa2ca8ea13d7cf45c8296e1e: CDI devices from CRI Config.CDIDevices: []" Jan 15 00:22:46.215999 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount51480305.mount: Deactivated successfully. Jan 15 00:22:46.222406 containerd[1703]: time="2026-01-15T00:22:46.222367623Z" level=info msg="CreateContainer within sandbox \"8fd85773f4a3a1bdb5c282f19c1ff8010859b09733c7170db160cce2e7d4dadd\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"b956648cffd3739ea0ffaceed79148c37448843faa2ca8ea13d7cf45c8296e1e\"" Jan 15 00:22:46.223346 containerd[1703]: time="2026-01-15T00:22:46.223295705Z" level=info msg="StartContainer for \"b956648cffd3739ea0ffaceed79148c37448843faa2ca8ea13d7cf45c8296e1e\"" Jan 15 00:22:46.225402 containerd[1703]: time="2026-01-15T00:22:46.225374552Z" level=info msg="connecting to shim b956648cffd3739ea0ffaceed79148c37448843faa2ca8ea13d7cf45c8296e1e" address="unix:///run/containerd/s/a805d5e3d65b86184e608849439d1b9f78144f8abb77e47a3e9761dc5bb47743" protocol=ttrpc version=3 Jan 15 00:22:46.247479 systemd[1]: Started cri-containerd-b956648cffd3739ea0ffaceed79148c37448843faa2ca8ea13d7cf45c8296e1e.scope - libcontainer container b956648cffd3739ea0ffaceed79148c37448843faa2ca8ea13d7cf45c8296e1e. 
Jan 15 00:22:46.301000 audit: BPF prog-id=138 op=LOAD Jan 15 00:22:46.301000 audit[3024]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=2988 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.301000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239353636343863666664333733396561306666616365656437393134 Jan 15 00:22:46.302000 audit: BPF prog-id=139 op=LOAD Jan 15 00:22:46.302000 audit[3024]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=2988 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.302000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239353636343863666664333733396561306666616365656437393134 Jan 15 00:22:46.302000 audit: BPF prog-id=139 op=UNLOAD Jan 15 00:22:46.302000 audit[3024]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2988 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.302000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239353636343863666664333733396561306666616365656437393134 Jan 15 00:22:46.302000 audit: BPF prog-id=138 op=UNLOAD Jan 15 00:22:46.302000 audit[3024]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2988 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.302000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239353636343863666664333733396561306666616365656437393134 Jan 15 00:22:46.302000 audit: BPF prog-id=140 op=LOAD Jan 15 00:22:46.302000 audit[3024]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=2988 pid=3024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.302000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239353636343863666664333733396561306666616365656437393134 Jan 15 00:22:46.324884 containerd[1703]: time="2026-01-15T00:22:46.324850616Z" level=info msg="StartContainer for 
\"b956648cffd3739ea0ffaceed79148c37448843faa2ca8ea13d7cf45c8296e1e\" returns successfully" Jan 15 00:22:46.438654 containerd[1703]: time="2026-01-15T00:22:46.438547324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-dtjxv,Uid:abfc9c6e-3f66-4f4e-8ad6-c72f5664bf0b,Namespace:tigera-operator,Attempt:0,}" Jan 15 00:22:46.461312 containerd[1703]: time="2026-01-15T00:22:46.461262513Z" level=info msg="connecting to shim 6db9f1f6a04d77876999e3dd848de1fd68714e4772dde678ded76bed332fd677" address="unix:///run/containerd/s/b1a6abcd76c5109a1fefe0941d607c8a89b208f1c06a4a9b62e64e3e0f4b6700" namespace=k8s.io protocol=ttrpc version=3 Jan 15 00:22:46.485000 audit[3120]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3120 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.485000 audit[3120]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffde037240 a2=0 a3=1 items=0 ppid=3036 pid=3120 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.485000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 15 00:22:46.486229 systemd[1]: Started cri-containerd-6db9f1f6a04d77876999e3dd848de1fd68714e4772dde678ded76bed332fd677.scope - libcontainer container 6db9f1f6a04d77876999e3dd848de1fd68714e4772dde678ded76bed332fd677. Jan 15 00:22:46.487000 audit[3121]: NETFILTER_CFG table=mangle:55 family=10 entries=1 op=nft_register_chain pid=3121 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.487000 audit[3121]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffdad384d0 a2=0 a3=1 items=0 ppid=3036 pid=3121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.487000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 15 00:22:46.487000 audit[3122]: NETFILTER_CFG table=nat:56 family=2 entries=1 op=nft_register_chain pid=3122 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.487000 audit[3122]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffde8f0d90 a2=0 a3=1 items=0 ppid=3036 pid=3122 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.487000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 15 00:22:46.488000 audit[3123]: NETFILTER_CFG table=nat:57 family=10 entries=1 op=nft_register_chain pid=3123 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.488000 audit[3123]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd53b6da0 a2=0 a3=1 items=0 ppid=3036 pid=3123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.488000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 
Jan 15 00:22:46.489000 audit[3124]: NETFILTER_CFG table=filter:58 family=2 entries=1 op=nft_register_chain pid=3124 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.489000 audit[3124]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffcc1bb6e0 a2=0 a3=1 items=0 ppid=3036 pid=3124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.489000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 15 00:22:46.490000 audit[3125]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3125 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.490000 audit[3125]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff7af0920 a2=0 a3=1 items=0 ppid=3036 pid=3125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.490000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 15 00:22:46.503000 audit: BPF prog-id=141 op=LOAD Jan 15 00:22:46.504000 audit: BPF prog-id=142 op=LOAD Jan 15 00:22:46.504000 audit[3096]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=3077 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.504000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664623966316636613034643737383736393939653364643834386465 Jan 15 00:22:46.504000 audit: BPF prog-id=142 op=UNLOAD Jan 15 00:22:46.504000 audit[3096]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3077 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.504000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664623966316636613034643737383736393939653364643834386465 Jan 15 00:22:46.504000 audit: BPF prog-id=143 op=LOAD Jan 15 00:22:46.504000 audit[3096]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=3077 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.504000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664623966316636613034643737383736393939653364643834386465 Jan 15 00:22:46.504000 audit: BPF prog-id=144 op=LOAD Jan 15 00:22:46.504000 audit[3096]: SYSCALL arch=c00000b7 
syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=3077 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.504000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664623966316636613034643737383736393939653364643834386465 Jan 15 00:22:46.504000 audit: BPF prog-id=144 op=UNLOAD Jan 15 00:22:46.504000 audit[3096]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3077 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.504000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664623966316636613034643737383736393939653364643834386465 Jan 15 00:22:46.504000 audit: BPF prog-id=143 op=UNLOAD Jan 15 00:22:46.504000 audit[3096]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3077 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.504000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664623966316636613034643737383736393939653364643834386465 Jan 15 00:22:46.504000 audit: BPF prog-id=145 op=LOAD Jan 15 00:22:46.504000 audit[3096]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=3077 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.504000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3664623966316636613034643737383736393939653364643834386465 Jan 15 00:22:46.525415 containerd[1703]: time="2026-01-15T00:22:46.525353630Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-dtjxv,Uid:abfc9c6e-3f66-4f4e-8ad6-c72f5664bf0b,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"6db9f1f6a04d77876999e3dd848de1fd68714e4772dde678ded76bed332fd677\"" Jan 15 00:22:46.527839 containerd[1703]: time="2026-01-15T00:22:46.527813077Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 15 00:22:46.588000 audit[3140]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3140 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.588000 audit[3140]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffc9d7bab0 a2=0 a3=1 items=0 ppid=3036 pid=3140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.588000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 15 00:22:46.591000 audit[3142]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3142 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.591000 audit[3142]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffda000bf0 a2=0 a3=1 items=0 ppid=3036 pid=3142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.591000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 15 00:22:46.594000 audit[3145]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3145 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.594000 audit[3145]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffd220e640 a2=0 a3=1 items=0 ppid=3036 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.594000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 15 00:22:46.595000 audit[3146]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3146 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.595000 audit[3146]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffda340d60 a2=0 a3=1 items=0 ppid=3036 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.595000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 15 00:22:46.598000 audit[3148]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3148 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.598000 audit[3148]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffca3a73d0 a2=0 a3=1 items=0 ppid=3036 pid=3148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.598000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 15 00:22:46.599000 audit[3149]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.599000 audit[3149]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=100 a0=3 a1=ffffc1310d00 a2=0 a3=1 items=0 ppid=3036 pid=3149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.599000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 15 00:22:46.602000 audit[3151]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3151 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.602000 audit[3151]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffe4cac250 a2=0 a3=1 items=0 ppid=3036 pid=3151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.602000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 15 00:22:46.606000 audit[3154]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3154 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.606000 audit[3154]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffeec6d7a0 a2=0 a3=1 items=0 ppid=3036 pid=3154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.606000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 15 00:22:46.607000 audit[3155]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3155 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.607000 audit[3155]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd3c6a7c0 a2=0 a3=1 items=0 ppid=3036 pid=3155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.607000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 15 00:22:46.610000 audit[3157]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3157 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.610000 audit[3157]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffffeb53d50 a2=0 a3=1 items=0 ppid=3036 pid=3157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.610000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 15 00:22:46.611000 audit[3158]: NETFILTER_CFG 
table=filter:70 family=2 entries=1 op=nft_register_chain pid=3158 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.611000 audit[3158]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd6afe820 a2=0 a3=1 items=0 ppid=3036 pid=3158 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.611000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 15 00:22:46.614000 audit[3160]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3160 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.614000 audit[3160]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffe90e2060 a2=0 a3=1 items=0 ppid=3036 pid=3160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.614000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 15 00:22:46.618000 audit[3163]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3163 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.618000 audit[3163]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffcfec4100 a2=0 a3=1 items=0 ppid=3036 pid=3163 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.618000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 15 00:22:46.621000 audit[3166]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3166 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.621000 audit[3166]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffec32e3d0 a2=0 a3=1 items=0 ppid=3036 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.621000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 15 00:22:46.623000 audit[3167]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3167 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.623000 audit[3167]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc13b9450 a2=0 a3=1 items=0 ppid=3036 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.623000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 15 00:22:46.625000 audit[3169]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3169 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.625000 audit[3169]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffd02214e0 a2=0 a3=1 items=0 ppid=3036 pid=3169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.625000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 15 00:22:46.629000 audit[3172]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3172 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.629000 audit[3172]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffcd688de0 a2=0 a3=1 items=0 ppid=3036 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.629000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 15 00:22:46.630000 audit[3173]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3173 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.630000 audit[3173]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffeb8ae1b0 a2=0 a3=1 items=0 ppid=3036 pid=3173 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.630000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 15 00:22:46.633000 audit[3175]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3175 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 00:22:46.633000 audit[3175]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffc15926d0 a2=0 a3=1 items=0 ppid=3036 pid=3175 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.633000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 15 00:22:46.657000 audit[3181]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3181 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:22:46.657000 audit[3181]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff0dea880 a2=0 a3=1 items=0 ppid=3036 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.657000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:22:46.673000 audit[3181]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3181 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:22:46.673000 audit[3181]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=fffff0dea880 a2=0 a3=1 items=0 ppid=3036 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.673000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:22:46.675000 audit[3186]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3186 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.675000 audit[3186]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=fffffd4466b0 a2=0 a3=1 items=0 ppid=3036 pid=3186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.675000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 15 00:22:46.679000 audit[3188]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3188 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.679000 audit[3188]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=ffffd09e7600 a2=0 a3=1 items=0 ppid=3036 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.679000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 15 00:22:46.683000 audit[3191]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3191 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.683000 audit[3191]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffd61b6bd0 a2=0 a3=1 items=0 ppid=3036 pid=3191 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.683000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 15 00:22:46.684000 audit[3192]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3192 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.684000 audit[3192]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=100 a0=3 a1=fffff333e8c0 a2=0 a3=1 items=0 ppid=3036 pid=3192 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.684000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 15 00:22:46.687000 audit[3194]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3194 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.687000 audit[3194]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffd60334b0 a2=0 a3=1 items=0 ppid=3036 pid=3194 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.687000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 15 00:22:46.688000 audit[3195]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3195 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.688000 audit[3195]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffeb112000 a2=0 a3=1 items=0 ppid=3036 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.688000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 15 00:22:46.692000 audit[3197]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3197 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.692000 audit[3197]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffd766e6e0 a2=0 a3=1 items=0 ppid=3036 pid=3197 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.692000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 15 00:22:46.695000 audit[3200]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3200 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.695000 audit[3200]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=ffffd662ac40 a2=0 a3=1 items=0 ppid=3036 pid=3200 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.695000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 15 
00:22:46.696000 audit[3201]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3201 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.696000 audit[3201]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd8295520 a2=0 a3=1 items=0 ppid=3036 pid=3201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.696000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 15 00:22:46.699000 audit[3203]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3203 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.699000 audit[3203]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffec8cc160 a2=0 a3=1 items=0 ppid=3036 pid=3203 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.699000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 15 00:22:46.700000 audit[3204]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3204 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.700000 audit[3204]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe99b0290 a2=0 a3=1 items=0 ppid=3036 pid=3204 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.700000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 15 00:22:46.702000 audit[3206]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3206 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.702000 audit[3206]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffdc18f6c0 a2=0 a3=1 items=0 ppid=3036 pid=3206 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.702000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 15 00:22:46.706000 audit[3209]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3209 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.706000 audit[3209]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff6dcecb0 a2=0 a3=1 items=0 ppid=3036 pid=3209 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.706000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 15 00:22:46.710000 audit[3212]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3212 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.710000 audit[3212]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffe477e300 a2=0 a3=1 items=0 ppid=3036 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.710000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 15 00:22:46.711000 audit[3213]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3213 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.711000 audit[3213]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffe49155d0 a2=0 a3=1 items=0 ppid=3036 pid=3213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.711000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 15 00:22:46.714000 audit[3215]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3215 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.714000 audit[3215]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffdeb2f7e0 a2=0 a3=1 items=0 ppid=3036 pid=3215 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.714000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 15 00:22:46.719000 audit[3218]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3218 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.719000 audit[3218]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffce98a7a0 a2=0 a3=1 items=0 ppid=3036 pid=3218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.719000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 15 00:22:46.720000 audit[3219]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3219 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.720000 audit[3219]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 
a1=ffffe771ef50 a2=0 a3=1 items=0 ppid=3036 pid=3219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.720000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 15 00:22:46.722000 audit[3221]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3221 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.722000 audit[3221]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffdf5e79b0 a2=0 a3=1 items=0 ppid=3036 pid=3221 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.722000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 15 00:22:46.724000 audit[3222]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3222 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.724000 audit[3222]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc61faaa0 a2=0 a3=1 items=0 ppid=3036 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.724000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 15 00:22:46.726000 audit[3224]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3224 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.726000 audit[3224]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffd1728b60 a2=0 a3=1 items=0 ppid=3036 pid=3224 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.726000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 15 00:22:46.729000 audit[3227]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3227 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 00:22:46.729000 audit[3227]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffeb673790 a2=0 a3=1 items=0 ppid=3036 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.729000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 15 00:22:46.732000 audit[3229]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3229 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 15 00:22:46.732000 audit[3229]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 a1=ffffcb7a3570 a2=0 a3=1 items=0 ppid=3036 
pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.732000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:22:46.733000 audit[3229]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3229 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 15 00:22:46.733000 audit[3229]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=ffffcb7a3570 a2=0 a3=1 items=0 ppid=3036 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:46.733000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:22:46.916718 kubelet[2931]: I0115 00:22:46.916639 2931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-5p8xs" podStartSLOduration=1.9166071059999998 podStartE2EDuration="1.916607106s" podCreationTimestamp="2026-01-15 00:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 00:22:46.916370466 +0000 UTC m=+8.133953853" watchObservedRunningTime="2026-01-15 00:22:46.916607106 +0000 UTC m=+8.134190413" Jan 15 00:22:48.439803 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4448408.mount: Deactivated successfully. Jan 15 00:22:48.702954 containerd[1703]: time="2026-01-15T00:22:48.702832651Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:48.704462 containerd[1703]: time="2026-01-15T00:22:48.704412295Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=0" Jan 15 00:22:48.705874 containerd[1703]: time="2026-01-15T00:22:48.705837820Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:48.708759 containerd[1703]: time="2026-01-15T00:22:48.708716869Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:22:48.709377 containerd[1703]: time="2026-01-15T00:22:48.709345550Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 2.181465913s" Jan 15 00:22:48.709406 containerd[1703]: time="2026-01-15T00:22:48.709377351Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Jan 15 00:22:48.711558 containerd[1703]: time="2026-01-15T00:22:48.711525197Z" level=info msg="CreateContainer within sandbox \"6db9f1f6a04d77876999e3dd848de1fd68714e4772dde678ded76bed332fd677\" for container 
&ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 15 00:22:48.721187 containerd[1703]: time="2026-01-15T00:22:48.721143787Z" level=info msg="Container 74ed9079d102c426017b480e8e91698b5364387037d9ab8e0f02d77389836195: CDI devices from CRI Config.CDIDevices: []" Jan 15 00:22:48.727343 containerd[1703]: time="2026-01-15T00:22:48.727306685Z" level=info msg="CreateContainer within sandbox \"6db9f1f6a04d77876999e3dd848de1fd68714e4772dde678ded76bed332fd677\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"74ed9079d102c426017b480e8e91698b5364387037d9ab8e0f02d77389836195\"" Jan 15 00:22:48.727917 containerd[1703]: time="2026-01-15T00:22:48.727884647Z" level=info msg="StartContainer for \"74ed9079d102c426017b480e8e91698b5364387037d9ab8e0f02d77389836195\"" Jan 15 00:22:48.729077 containerd[1703]: time="2026-01-15T00:22:48.729023851Z" level=info msg="connecting to shim 74ed9079d102c426017b480e8e91698b5364387037d9ab8e0f02d77389836195" address="unix:///run/containerd/s/b1a6abcd76c5109a1fefe0941d607c8a89b208f1c06a4a9b62e64e3e0f4b6700" protocol=ttrpc version=3 Jan 15 00:22:48.752415 systemd[1]: Started cri-containerd-74ed9079d102c426017b480e8e91698b5364387037d9ab8e0f02d77389836195.scope - libcontainer container 74ed9079d102c426017b480e8e91698b5364387037d9ab8e0f02d77389836195. Jan 15 00:22:48.762000 audit: BPF prog-id=146 op=LOAD Jan 15 00:22:48.762000 audit: BPF prog-id=147 op=LOAD Jan 15 00:22:48.762000 audit[3238]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=3077 pid=3238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:48.762000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734656439303739643130326334323630313762343830653865393136 Jan 15 00:22:48.763000 audit: BPF prog-id=147 op=UNLOAD Jan 15 00:22:48.763000 audit[3238]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3077 pid=3238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:48.763000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734656439303739643130326334323630313762343830653865393136 Jan 15 00:22:48.763000 audit: BPF prog-id=148 op=LOAD Jan 15 00:22:48.763000 audit[3238]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=3077 pid=3238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:48.763000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734656439303739643130326334323630313762343830653865393136 Jan 15 00:22:48.763000 audit: BPF prog-id=149 op=LOAD Jan 15 00:22:48.763000 audit[3238]: SYSCALL arch=c00000b7 syscall=280 success=yes 
exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=3077 pid=3238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:48.763000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734656439303739643130326334323630313762343830653865393136 Jan 15 00:22:48.763000 audit: BPF prog-id=149 op=UNLOAD Jan 15 00:22:48.763000 audit[3238]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3077 pid=3238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:48.763000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734656439303739643130326334323630313762343830653865393136 Jan 15 00:22:48.763000 audit: BPF prog-id=148 op=UNLOAD Jan 15 00:22:48.763000 audit[3238]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3077 pid=3238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:48.763000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734656439303739643130326334323630313762343830653865393136 Jan 15 00:22:48.763000 audit: BPF prog-id=150 op=LOAD Jan 15 00:22:48.763000 audit[3238]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=3077 pid=3238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:48.763000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3734656439303739643130326334323630313762343830653865393136 Jan 15 00:22:48.779016 containerd[1703]: time="2026-01-15T00:22:48.778939283Z" level=info msg="StartContainer for \"74ed9079d102c426017b480e8e91698b5364387037d9ab8e0f02d77389836195\" returns successfully" Jan 15 00:22:49.680326 kubelet[2931]: I0115 00:22:49.680243 2931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-dtjxv" podStartSLOduration=1.496476119 podStartE2EDuration="3.68022692s" podCreationTimestamp="2026-01-15 00:22:46 +0000 UTC" firstStartedPulling="2026-01-15 00:22:46.526619433 +0000 UTC m=+7.744202780" lastFinishedPulling="2026-01-15 00:22:48.710370234 +0000 UTC m=+9.927953581" observedRunningTime="2026-01-15 00:22:48.920240996 +0000 UTC m=+10.137824423" watchObservedRunningTime="2026-01-15 00:22:49.68022692 +0000 UTC m=+10.897810267" Jan 15 00:22:53.900801 sudo[1981]: pam_unix(sudo:session): session closed for user root Jan 15 00:22:53.900000 audit[1981]: USER_END pid=1981 
uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 00:22:53.905648 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 15 00:22:53.905746 kernel: audit: type=1106 audit(1768436573.900:527): pid=1981 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 00:22:53.905772 kernel: audit: type=1104 audit(1768436573.900:528): pid=1981 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 00:22:53.900000 audit[1981]: CRED_DISP pid=1981 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 00:22:54.000529 sshd[1980]: Connection closed by 20.161.92.111 port 37720 Jan 15 00:22:54.000861 sshd-session[1961]: pam_unix(sshd:session): session closed for user core Jan 15 00:22:54.002000 audit[1961]: USER_END pid=1961 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:22:54.006678 systemd[1]: sshd@8-10.0.3.29:22-20.161.92.111:37720.service: Deactivated successfully. Jan 15 00:22:54.006956 systemd-logind[1681]: Session 9 logged out. Waiting for processes to exit. Jan 15 00:22:54.002000 audit[1961]: CRED_DISP pid=1961 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:22:54.011029 kernel: audit: type=1106 audit(1768436574.002:529): pid=1961 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:22:54.011135 kernel: audit: type=1104 audit(1768436574.002:530): pid=1961 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:22:54.011161 kernel: audit: type=1131 audit(1768436574.007:531): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.3.29:22-20.161.92.111:37720 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:54.007000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.3.29:22-20.161.92.111:37720 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:22:54.014689 systemd[1]: session-9.scope: Deactivated successfully. 
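A brief aside on the record format above: each kernel "audit: type=... audit(<epoch>.<millis>:<serial>)" line carries a Unix-epoch timestamp plus a record serial, and the epoch part matches the journal timestamp at the head of the same entry. A small illustrative Python check (not part of the log), using the stamp from the line above:

    from datetime import datetime, timezone

    stamp = "1768436573.900:527"            # copied from the audit line above
    epoch, serial = stamp.split(":")
    ts = datetime.fromtimestamp(float(epoch), tz=timezone.utc)
    # Prints 2026-01-15T00:22:53.900...+00:00, i.e. the same Jan 15 00:22:53.900 shown by the journal.
    print(ts.isoformat(), "serial", serial)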
Jan 15 00:22:54.017464 systemd[1]: session-9.scope: Consumed 8.291s CPU time, 220.7M memory peak. Jan 15 00:22:54.021480 systemd-logind[1681]: Removed session 9. Jan 15 00:22:54.623000 audit[3329]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:22:54.623000 audit[3329]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffee0f1c50 a2=0 a3=1 items=0 ppid=3036 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:54.630066 kernel: audit: type=1325 audit(1768436574.623:532): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:22:54.630999 kernel: audit: type=1300 audit(1768436574.623:532): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffee0f1c50 a2=0 a3=1 items=0 ppid=3036 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:54.623000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:22:54.635201 kernel: audit: type=1327 audit(1768436574.623:532): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:22:54.633000 audit[3329]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:22:54.638887 kernel: audit: type=1325 audit(1768436574.633:533): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:22:54.633000 audit[3329]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffee0f1c50 a2=0 a3=1 items=0 ppid=3036 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:54.644282 kernel: audit: type=1300 audit(1768436574.633:533): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffee0f1c50 a2=0 a3=1 items=0 ppid=3036 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:54.633000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:22:54.650000 audit[3331]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3331 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:22:54.650000 audit[3331]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=fffffb107080 a2=0 a3=1 items=0 ppid=3036 pid=3331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:54.650000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 
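The proctitle= values in these NETFILTER_CFG/SYSCALL records are the full command line of the process, hex-encoded by auditd with NUL bytes separating the arguments. A minimal Python sketch of the decoding (it assumes only the standard auditd encoding of NUL-separated argv; decode_proctitle is an illustrative helper, not a tool referenced by the log), applied to the entry directly above, shows kube-proxy's periodic iptables-restore call:

    def decode_proctitle(hex_value: str) -> list[str]:
        """Turn an audit PROCTITLE hex string back into the argv it encodes."""
        raw = bytes.fromhex(hex_value)
        return [arg.decode("utf-8", "replace") for arg in raw.split(b"\x00") if arg]

    # Value taken verbatim from the PROCTITLE record above:
    print(decode_proctitle(
        "69707461626C65732D726573746F7265002D770035002D5700313030303030"
        "002D2D6E6F666C757368002D2D636F756E74657273"
    ))
    # ['iptables-restore', '-w', '5', '-W', '100000', '--noflush', '--counters']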
Jan 15 00:22:54.658000 audit[3331]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3331 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:22:54.658000 audit[3331]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffffb107080 a2=0 a3=1 items=0 ppid=3036 pid=3331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:54.658000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:22:58.975000 audit[3334]: NETFILTER_CFG table=filter:109 family=2 entries=16 op=nft_register_rule pid=3334 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:22:58.976341 kernel: kauditd_printk_skb: 7 callbacks suppressed Jan 15 00:22:58.976378 kernel: audit: type=1325 audit(1768436578.975:536): table=filter:109 family=2 entries=16 op=nft_register_rule pid=3334 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:22:58.975000 audit[3334]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffcf17fa30 a2=0 a3=1 items=0 ppid=3036 pid=3334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:58.982822 kernel: audit: type=1300 audit(1768436578.975:536): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffcf17fa30 a2=0 a3=1 items=0 ppid=3036 pid=3334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:58.975000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:22:58.984759 kernel: audit: type=1327 audit(1768436578.975:536): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:22:58.986000 audit[3334]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3334 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:22:58.990188 kernel: audit: type=1325 audit(1768436578.986:537): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3334 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:22:58.986000 audit[3334]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffcf17fa30 a2=0 a3=1 items=0 ppid=3036 pid=3334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:58.995191 kernel: audit: type=1300 audit(1768436578.986:537): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffcf17fa30 a2=0 a3=1 items=0 ppid=3036 pid=3334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:58.995261 kernel: audit: type=1327 audit(1768436578.986:537): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:22:58.986000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:22:58.998000 audit[3336]: NETFILTER_CFG table=filter:111 family=2 entries=17 op=nft_register_rule pid=3336 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:22:58.998000 audit[3336]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffffb006660 a2=0 a3=1 items=0 ppid=3036 pid=3336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:59.004699 kernel: audit: type=1325 audit(1768436578.998:538): table=filter:111 family=2 entries=17 op=nft_register_rule pid=3336 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:22:59.004774 kernel: audit: type=1300 audit(1768436578.998:538): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffffb006660 a2=0 a3=1 items=0 ppid=3036 pid=3336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:59.004796 kernel: audit: type=1327 audit(1768436578.998:538): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:22:58.998000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:22:59.007000 audit[3336]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3336 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:22:59.007000 audit[3336]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffffb006660 a2=0 a3=1 items=0 ppid=3036 pid=3336 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:22:59.007000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:22:59.010194 kernel: audit: type=1325 audit(1768436579.007:539): table=nat:112 family=2 entries=12 op=nft_register_rule pid=3336 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:00.021000 audit[3338]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3338 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:00.021000 audit[3338]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffff3c80070 a2=0 a3=1 items=0 ppid=3036 pid=3338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:00.021000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:00.027000 audit[3338]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3338 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:00.027000 audit[3338]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff3c80070 a2=0 a3=1 items=0 ppid=3036 pid=3338 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:00.027000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:01.697000 audit[3340]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3340 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:01.697000 audit[3340]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffc0d91030 a2=0 a3=1 items=0 ppid=3036 pid=3340 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:01.697000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:01.703000 audit[3340]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3340 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:01.703000 audit[3340]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc0d91030 a2=0 a3=1 items=0 ppid=3036 pid=3340 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:01.703000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:01.727559 systemd[1]: Created slice kubepods-besteffort-pod199bece0_0cfc_422f_88ca_a554531076a4.slice - libcontainer container kubepods-besteffort-pod199bece0_0cfc_422f_88ca_a554531076a4.slice. Jan 15 00:23:01.796508 kubelet[2931]: I0115 00:23:01.796384 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/199bece0-0cfc-422f-88ca-a554531076a4-tigera-ca-bundle\") pod \"calico-typha-8685c9dd54-sf7ft\" (UID: \"199bece0-0cfc-422f-88ca-a554531076a4\") " pod="calico-system/calico-typha-8685c9dd54-sf7ft" Jan 15 00:23:01.796508 kubelet[2931]: I0115 00:23:01.796446 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54t6k\" (UniqueName: \"kubernetes.io/projected/199bece0-0cfc-422f-88ca-a554531076a4-kube-api-access-54t6k\") pod \"calico-typha-8685c9dd54-sf7ft\" (UID: \"199bece0-0cfc-422f-88ca-a554531076a4\") " pod="calico-system/calico-typha-8685c9dd54-sf7ft" Jan 15 00:23:01.796508 kubelet[2931]: I0115 00:23:01.796466 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/199bece0-0cfc-422f-88ca-a554531076a4-typha-certs\") pod \"calico-typha-8685c9dd54-sf7ft\" (UID: \"199bece0-0cfc-422f-88ca-a554531076a4\") " pod="calico-system/calico-typha-8685c9dd54-sf7ft" Jan 15 00:23:01.915723 systemd[1]: Created slice kubepods-besteffort-pod4739e896_077b_404b_9a83_70804e7ca3ee.slice - libcontainer container kubepods-besteffort-pod4739e896_077b_404b_9a83_70804e7ca3ee.slice. 
Jan 15 00:23:01.998324 kubelet[2931]: I0115 00:23:01.998036 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh7wn\" (UniqueName: \"kubernetes.io/projected/4739e896-077b-404b-9a83-70804e7ca3ee-kube-api-access-jh7wn\") pod \"calico-node-rh7w5\" (UID: \"4739e896-077b-404b-9a83-70804e7ca3ee\") " pod="calico-system/calico-node-rh7w5" Jan 15 00:23:01.998324 kubelet[2931]: I0115 00:23:01.998090 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/4739e896-077b-404b-9a83-70804e7ca3ee-node-certs\") pod \"calico-node-rh7w5\" (UID: \"4739e896-077b-404b-9a83-70804e7ca3ee\") " pod="calico-system/calico-node-rh7w5" Jan 15 00:23:01.998324 kubelet[2931]: I0115 00:23:01.998113 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4739e896-077b-404b-9a83-70804e7ca3ee-var-lib-calico\") pod \"calico-node-rh7w5\" (UID: \"4739e896-077b-404b-9a83-70804e7ca3ee\") " pod="calico-system/calico-node-rh7w5" Jan 15 00:23:01.998324 kubelet[2931]: I0115 00:23:01.998130 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/4739e896-077b-404b-9a83-70804e7ca3ee-cni-log-dir\") pod \"calico-node-rh7w5\" (UID: \"4739e896-077b-404b-9a83-70804e7ca3ee\") " pod="calico-system/calico-node-rh7w5" Jan 15 00:23:01.998324 kubelet[2931]: I0115 00:23:01.998163 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4739e896-077b-404b-9a83-70804e7ca3ee-lib-modules\") pod \"calico-node-rh7w5\" (UID: \"4739e896-077b-404b-9a83-70804e7ca3ee\") " pod="calico-system/calico-node-rh7w5" Jan 15 00:23:01.998792 kubelet[2931]: I0115 00:23:01.998206 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/4739e896-077b-404b-9a83-70804e7ca3ee-var-run-calico\") pod \"calico-node-rh7w5\" (UID: \"4739e896-077b-404b-9a83-70804e7ca3ee\") " pod="calico-system/calico-node-rh7w5" Jan 15 00:23:01.998792 kubelet[2931]: I0115 00:23:01.998223 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4739e896-077b-404b-9a83-70804e7ca3ee-xtables-lock\") pod \"calico-node-rh7w5\" (UID: \"4739e896-077b-404b-9a83-70804e7ca3ee\") " pod="calico-system/calico-node-rh7w5" Jan 15 00:23:01.998792 kubelet[2931]: I0115 00:23:01.998238 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/4739e896-077b-404b-9a83-70804e7ca3ee-flexvol-driver-host\") pod \"calico-node-rh7w5\" (UID: \"4739e896-077b-404b-9a83-70804e7ca3ee\") " pod="calico-system/calico-node-rh7w5" Jan 15 00:23:01.998792 kubelet[2931]: I0115 00:23:01.998473 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4739e896-077b-404b-9a83-70804e7ca3ee-tigera-ca-bundle\") pod \"calico-node-rh7w5\" (UID: \"4739e896-077b-404b-9a83-70804e7ca3ee\") " pod="calico-system/calico-node-rh7w5" Jan 15 00:23:01.998792 kubelet[2931]: I0115 00:23:01.998562 2931 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/4739e896-077b-404b-9a83-70804e7ca3ee-cni-net-dir\") pod \"calico-node-rh7w5\" (UID: \"4739e896-077b-404b-9a83-70804e7ca3ee\") " pod="calico-system/calico-node-rh7w5" Jan 15 00:23:01.998926 kubelet[2931]: I0115 00:23:01.998587 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/4739e896-077b-404b-9a83-70804e7ca3ee-cni-bin-dir\") pod \"calico-node-rh7w5\" (UID: \"4739e896-077b-404b-9a83-70804e7ca3ee\") " pod="calico-system/calico-node-rh7w5" Jan 15 00:23:01.998926 kubelet[2931]: I0115 00:23:01.998601 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/4739e896-077b-404b-9a83-70804e7ca3ee-policysync\") pod \"calico-node-rh7w5\" (UID: \"4739e896-077b-404b-9a83-70804e7ca3ee\") " pod="calico-system/calico-node-rh7w5" Jan 15 00:23:02.032165 containerd[1703]: time="2026-01-15T00:23:02.032116271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8685c9dd54-sf7ft,Uid:199bece0-0cfc-422f-88ca-a554531076a4,Namespace:calico-system,Attempt:0,}" Jan 15 00:23:02.053075 containerd[1703]: time="2026-01-15T00:23:02.053017415Z" level=info msg="connecting to shim 5d26872dffc67f8a3885c2f6f881025649956d123c9f56e5f65f74713df04853" address="unix:///run/containerd/s/870ced865431b981afb963f57a300398876feae5f2aa509490bec17efc3cc78c" namespace=k8s.io protocol=ttrpc version=3 Jan 15 00:23:02.079418 systemd[1]: Started cri-containerd-5d26872dffc67f8a3885c2f6f881025649956d123c9f56e5f65f74713df04853.scope - libcontainer container 5d26872dffc67f8a3885c2f6f881025649956d123c9f56e5f65f74713df04853. Jan 15 00:23:02.103125 kubelet[2931]: E0115 00:23:02.103080 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.103125 kubelet[2931]: W0115 00:23:02.103117 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.103296 kubelet[2931]: E0115 00:23:02.103138 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.103439 kubelet[2931]: E0115 00:23:02.103420 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.103439 kubelet[2931]: W0115 00:23:02.103435 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.103502 kubelet[2931]: E0115 00:23:02.103447 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.103662 kubelet[2931]: E0115 00:23:02.103647 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.103662 kubelet[2931]: W0115 00:23:02.103660 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.103718 kubelet[2931]: E0115 00:23:02.103674 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.104107 kubelet[2931]: E0115 00:23:02.104082 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.104107 kubelet[2931]: W0115 00:23:02.104097 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.104216 kubelet[2931]: E0115 00:23:02.104111 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.104866 kubelet[2931]: E0115 00:23:02.104451 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:02.104866 kubelet[2931]: E0115 00:23:02.104603 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.104866 kubelet[2931]: W0115 00:23:02.104615 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.104866 kubelet[2931]: E0115 00:23:02.104638 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.104866 kubelet[2931]: E0115 00:23:02.104861 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.104866 kubelet[2931]: W0115 00:23:02.104875 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.105041 kubelet[2931]: E0115 00:23:02.104886 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.105681 kubelet[2931]: E0115 00:23:02.105087 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.105681 kubelet[2931]: W0115 00:23:02.105102 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.105681 kubelet[2931]: E0115 00:23:02.105112 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.105681 kubelet[2931]: E0115 00:23:02.105348 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.105681 kubelet[2931]: W0115 00:23:02.105360 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.105681 kubelet[2931]: E0115 00:23:02.105378 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.105843 kubelet[2931]: E0115 00:23:02.105754 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.105843 kubelet[2931]: W0115 00:23:02.105772 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.105843 kubelet[2931]: E0115 00:23:02.105795 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.113167 kubelet[2931]: E0115 00:23:02.113131 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.113167 kubelet[2931]: W0115 00:23:02.113154 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.113309 kubelet[2931]: E0115 00:23:02.113191 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.113652 kubelet[2931]: E0115 00:23:02.113628 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.113652 kubelet[2931]: W0115 00:23:02.113643 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.113714 kubelet[2931]: E0115 00:23:02.113657 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.120000 audit: BPF prog-id=151 op=LOAD Jan 15 00:23:02.120000 audit: BPF prog-id=152 op=LOAD Jan 15 00:23:02.120000 audit[3364]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=3351 pid=3364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.120000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564323638373264666663363766386133383835633266366638383130 Jan 15 00:23:02.121000 audit: BPF prog-id=152 op=UNLOAD Jan 15 00:23:02.121000 audit[3364]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3351 pid=3364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.121000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564323638373264666663363766386133383835633266366638383130 Jan 15 00:23:02.122000 audit: BPF prog-id=153 op=LOAD Jan 15 00:23:02.122000 audit[3364]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=3351 pid=3364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.122000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564323638373264666663363766386133383835633266366638383130 Jan 15 00:23:02.122000 audit: BPF prog-id=154 op=LOAD Jan 15 00:23:02.122000 audit[3364]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=3351 pid=3364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.122000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564323638373264666663363766386133383835633266366638383130 Jan 15 00:23:02.122000 audit: BPF prog-id=154 op=UNLOAD Jan 15 00:23:02.122000 audit[3364]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3351 pid=3364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.122000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564323638373264666663363766386133383835633266366638383130 Jan 15 00:23:02.122000 audit: BPF prog-id=153 op=UNLOAD Jan 
15 00:23:02.122000 audit[3364]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3351 pid=3364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.122000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564323638373264666663363766386133383835633266366638383130 Jan 15 00:23:02.122000 audit: BPF prog-id=155 op=LOAD Jan 15 00:23:02.122000 audit[3364]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=3351 pid=3364 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.122000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564323638373264666663363766386133383835633266366638383130 Jan 15 00:23:02.128048 kubelet[2931]: E0115 00:23:02.128018 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.128048 kubelet[2931]: W0115 00:23:02.128037 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.128195 kubelet[2931]: E0115 00:23:02.128057 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.154863 containerd[1703]: time="2026-01-15T00:23:02.154821887Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8685c9dd54-sf7ft,Uid:199bece0-0cfc-422f-88ca-a554531076a4,Namespace:calico-system,Attempt:0,} returns sandbox id \"5d26872dffc67f8a3885c2f6f881025649956d123c9f56e5f65f74713df04853\"" Jan 15 00:23:02.156693 containerd[1703]: time="2026-01-15T00:23:02.156659212Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 15 00:23:02.190453 kubelet[2931]: E0115 00:23:02.190414 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.190453 kubelet[2931]: W0115 00:23:02.190441 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.190453 kubelet[2931]: E0115 00:23:02.190464 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.190662 kubelet[2931]: E0115 00:23:02.190647 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.190714 kubelet[2931]: W0115 00:23:02.190658 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.190714 kubelet[2931]: E0115 00:23:02.190702 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.190881 kubelet[2931]: E0115 00:23:02.190852 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.190881 kubelet[2931]: W0115 00:23:02.190867 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.190881 kubelet[2931]: E0115 00:23:02.190875 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.191018 kubelet[2931]: E0115 00:23:02.191001 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.191018 kubelet[2931]: W0115 00:23:02.191007 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.191018 kubelet[2931]: E0115 00:23:02.191015 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.191176 kubelet[2931]: E0115 00:23:02.191150 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.191176 kubelet[2931]: W0115 00:23:02.191161 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.191230 kubelet[2931]: E0115 00:23:02.191169 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.191410 kubelet[2931]: E0115 00:23:02.191394 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.191410 kubelet[2931]: W0115 00:23:02.191408 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.191464 kubelet[2931]: E0115 00:23:02.191421 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.191566 kubelet[2931]: E0115 00:23:02.191553 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.191566 kubelet[2931]: W0115 00:23:02.191564 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.191617 kubelet[2931]: E0115 00:23:02.191572 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.191704 kubelet[2931]: E0115 00:23:02.191694 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.191704 kubelet[2931]: W0115 00:23:02.191703 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.191746 kubelet[2931]: E0115 00:23:02.191711 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.191850 kubelet[2931]: E0115 00:23:02.191840 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.191874 kubelet[2931]: W0115 00:23:02.191849 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.191874 kubelet[2931]: E0115 00:23:02.191857 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.191980 kubelet[2931]: E0115 00:23:02.191970 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.192005 kubelet[2931]: W0115 00:23:02.191979 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.192005 kubelet[2931]: E0115 00:23:02.191987 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.192109 kubelet[2931]: E0115 00:23:02.192099 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.192133 kubelet[2931]: W0115 00:23:02.192109 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.192133 kubelet[2931]: E0115 00:23:02.192117 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.192274 kubelet[2931]: E0115 00:23:02.192263 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.192274 kubelet[2931]: W0115 00:23:02.192273 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.192323 kubelet[2931]: E0115 00:23:02.192283 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.192432 kubelet[2931]: E0115 00:23:02.192421 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.192456 kubelet[2931]: W0115 00:23:02.192431 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.192456 kubelet[2931]: E0115 00:23:02.192440 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.192572 kubelet[2931]: E0115 00:23:02.192562 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.192594 kubelet[2931]: W0115 00:23:02.192571 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.192594 kubelet[2931]: E0115 00:23:02.192580 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.192723 kubelet[2931]: E0115 00:23:02.192710 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.192723 kubelet[2931]: W0115 00:23:02.192721 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.192768 kubelet[2931]: E0115 00:23:02.192732 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.192867 kubelet[2931]: E0115 00:23:02.192855 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.192867 kubelet[2931]: W0115 00:23:02.192865 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.192909 kubelet[2931]: E0115 00:23:02.192873 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.193014 kubelet[2931]: E0115 00:23:02.193004 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.193038 kubelet[2931]: W0115 00:23:02.193013 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.193038 kubelet[2931]: E0115 00:23:02.193025 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.193148 kubelet[2931]: E0115 00:23:02.193138 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.193191 kubelet[2931]: W0115 00:23:02.193147 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.193191 kubelet[2931]: E0115 00:23:02.193155 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.193297 kubelet[2931]: E0115 00:23:02.193285 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.193297 kubelet[2931]: W0115 00:23:02.193295 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.193342 kubelet[2931]: E0115 00:23:02.193303 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.193431 kubelet[2931]: E0115 00:23:02.193421 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.193455 kubelet[2931]: W0115 00:23:02.193430 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.193455 kubelet[2931]: E0115 00:23:02.193438 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.200846 kubelet[2931]: E0115 00:23:02.200782 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.200846 kubelet[2931]: W0115 00:23:02.200800 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.200846 kubelet[2931]: E0115 00:23:02.200813 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.200846 kubelet[2931]: I0115 00:23:02.200835 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e8af8aef-db47-4bb0-9303-531f44a2593e-registration-dir\") pod \"csi-node-driver-92nsn\" (UID: \"e8af8aef-db47-4bb0-9303-531f44a2593e\") " pod="calico-system/csi-node-driver-92nsn" Jan 15 00:23:02.201151 kubelet[2931]: E0115 00:23:02.200977 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.201151 kubelet[2931]: W0115 00:23:02.200986 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.201151 kubelet[2931]: E0115 00:23:02.201007 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.201151 kubelet[2931]: I0115 00:23:02.201021 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e8af8aef-db47-4bb0-9303-531f44a2593e-socket-dir\") pod \"csi-node-driver-92nsn\" (UID: \"e8af8aef-db47-4bb0-9303-531f44a2593e\") " pod="calico-system/csi-node-driver-92nsn" Jan 15 00:23:02.201151 kubelet[2931]: E0115 00:23:02.201154 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.201293 kubelet[2931]: W0115 00:23:02.201163 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.201293 kubelet[2931]: E0115 00:23:02.201182 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.201293 kubelet[2931]: I0115 00:23:02.201197 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx8sb\" (UniqueName: \"kubernetes.io/projected/e8af8aef-db47-4bb0-9303-531f44a2593e-kube-api-access-cx8sb\") pod \"csi-node-driver-92nsn\" (UID: \"e8af8aef-db47-4bb0-9303-531f44a2593e\") " pod="calico-system/csi-node-driver-92nsn" Jan 15 00:23:02.201354 kubelet[2931]: E0115 00:23:02.201334 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.201354 kubelet[2931]: W0115 00:23:02.201342 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.201354 kubelet[2931]: E0115 00:23:02.201351 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.201414 kubelet[2931]: I0115 00:23:02.201364 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e8af8aef-db47-4bb0-9303-531f44a2593e-kubelet-dir\") pod \"csi-node-driver-92nsn\" (UID: \"e8af8aef-db47-4bb0-9303-531f44a2593e\") " pod="calico-system/csi-node-driver-92nsn" Jan 15 00:23:02.201509 kubelet[2931]: E0115 00:23:02.201487 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.201509 kubelet[2931]: W0115 00:23:02.201499 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.201509 kubelet[2931]: E0115 00:23:02.201513 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.201509 kubelet[2931]: I0115 00:23:02.201526 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/e8af8aef-db47-4bb0-9303-531f44a2593e-varrun\") pod \"csi-node-driver-92nsn\" (UID: \"e8af8aef-db47-4bb0-9303-531f44a2593e\") " pod="calico-system/csi-node-driver-92nsn" Jan 15 00:23:02.202114 kubelet[2931]: E0115 00:23:02.201732 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.202114 kubelet[2931]: W0115 00:23:02.201745 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.202114 kubelet[2931]: E0115 00:23:02.201765 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.202114 kubelet[2931]: E0115 00:23:02.201934 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.202114 kubelet[2931]: W0115 00:23:02.201943 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.202114 kubelet[2931]: E0115 00:23:02.201960 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.202333 kubelet[2931]: E0115 00:23:02.202319 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.202388 kubelet[2931]: W0115 00:23:02.202377 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.202449 kubelet[2931]: E0115 00:23:02.202438 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.202647 kubelet[2931]: E0115 00:23:02.202632 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.202795 kubelet[2931]: W0115 00:23:02.202705 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.202795 kubelet[2931]: E0115 00:23:02.202738 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.202943 kubelet[2931]: E0115 00:23:02.202929 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.203000 kubelet[2931]: W0115 00:23:02.202988 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.203074 kubelet[2931]: E0115 00:23:02.203055 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.203271 kubelet[2931]: E0115 00:23:02.203258 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.203347 kubelet[2931]: W0115 00:23:02.203335 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.203480 kubelet[2931]: E0115 00:23:02.203462 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.203748 kubelet[2931]: E0115 00:23:02.203658 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.203748 kubelet[2931]: W0115 00:23:02.203671 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.203748 kubelet[2931]: E0115 00:23:02.203694 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.203912 kubelet[2931]: E0115 00:23:02.203898 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.203972 kubelet[2931]: W0115 00:23:02.203961 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.204025 kubelet[2931]: E0115 00:23:02.204015 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.204347 kubelet[2931]: E0115 00:23:02.204234 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.204347 kubelet[2931]: W0115 00:23:02.204249 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.204347 kubelet[2931]: E0115 00:23:02.204260 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.204521 kubelet[2931]: E0115 00:23:02.204508 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.204582 kubelet[2931]: W0115 00:23:02.204571 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.204650 kubelet[2931]: E0115 00:23:02.204638 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.221237 containerd[1703]: time="2026-01-15T00:23:02.221170250Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-rh7w5,Uid:4739e896-077b-404b-9a83-70804e7ca3ee,Namespace:calico-system,Attempt:0,}" Jan 15 00:23:02.253956 containerd[1703]: time="2026-01-15T00:23:02.253737389Z" level=info msg="connecting to shim d6ae2bcad0acad21822d5e685815846b96808ca2fac64e63071a36e05055577e" address="unix:///run/containerd/s/02da95c7790ec6d92d44d4add68da87d8f7286b5aa506bb12dc72ccb99e74722" namespace=k8s.io protocol=ttrpc version=3 Jan 15 00:23:02.278500 systemd[1]: Started cri-containerd-d6ae2bcad0acad21822d5e685815846b96808ca2fac64e63071a36e05055577e.scope - libcontainer container d6ae2bcad0acad21822d5e685815846b96808ca2fac64e63071a36e05055577e. 
Jan 15 00:23:02.291000 audit: BPF prog-id=156 op=LOAD Jan 15 00:23:02.292000 audit: BPF prog-id=157 op=LOAD Jan 15 00:23:02.292000 audit[3468]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=3456 pid=3468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.292000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436616532626361643061636164323138323264356536383538313538 Jan 15 00:23:02.292000 audit: BPF prog-id=157 op=UNLOAD Jan 15 00:23:02.292000 audit[3468]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3456 pid=3468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.292000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436616532626361643061636164323138323264356536383538313538 Jan 15 00:23:02.292000 audit: BPF prog-id=158 op=LOAD Jan 15 00:23:02.292000 audit[3468]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=3456 pid=3468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.292000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436616532626361643061636164323138323264356536383538313538 Jan 15 00:23:02.292000 audit: BPF prog-id=159 op=LOAD Jan 15 00:23:02.292000 audit[3468]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=3456 pid=3468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.292000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436616532626361643061636164323138323264356536383538313538 Jan 15 00:23:02.292000 audit: BPF prog-id=159 op=UNLOAD Jan 15 00:23:02.292000 audit[3468]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3456 pid=3468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.292000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436616532626361643061636164323138323264356536383538313538 Jan 15 00:23:02.292000 audit: BPF prog-id=158 op=UNLOAD Jan 15 00:23:02.292000 audit[3468]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3456 pid=3468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.292000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436616532626361643061636164323138323264356536383538313538 Jan 15 00:23:02.292000 audit: BPF prog-id=160 op=LOAD Jan 15 00:23:02.292000 audit[3468]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=3456 pid=3468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.292000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436616532626361643061636164323138323264356536383538313538 Jan 15 00:23:02.302591 kubelet[2931]: E0115 00:23:02.302550 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.302591 kubelet[2931]: W0115 00:23:02.302581 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.302591 kubelet[2931]: E0115 00:23:02.302601 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.302879 kubelet[2931]: E0115 00:23:02.302843 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.302879 kubelet[2931]: W0115 00:23:02.302864 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.302929 kubelet[2931]: E0115 00:23:02.302881 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.303165 kubelet[2931]: E0115 00:23:02.303150 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.303233 kubelet[2931]: W0115 00:23:02.303165 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.303233 kubelet[2931]: E0115 00:23:02.303203 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.303560 kubelet[2931]: E0115 00:23:02.303541 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.303560 kubelet[2931]: W0115 00:23:02.303555 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.303694 kubelet[2931]: E0115 00:23:02.303578 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.304244 kubelet[2931]: E0115 00:23:02.304229 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.304244 kubelet[2931]: W0115 00:23:02.304242 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.304335 kubelet[2931]: E0115 00:23:02.304270 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.304487 kubelet[2931]: E0115 00:23:02.304467 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.304533 kubelet[2931]: W0115 00:23:02.304486 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.304533 kubelet[2931]: E0115 00:23:02.304503 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.305790 containerd[1703]: time="2026-01-15T00:23:02.305757348Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-rh7w5,Uid:4739e896-077b-404b-9a83-70804e7ca3ee,Namespace:calico-system,Attempt:0,} returns sandbox id \"d6ae2bcad0acad21822d5e685815846b96808ca2fac64e63071a36e05055577e\"" Jan 15 00:23:02.306721 kubelet[2931]: E0115 00:23:02.306517 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.307229 kubelet[2931]: W0115 00:23:02.306822 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.307229 kubelet[2931]: E0115 00:23:02.306950 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.307727 kubelet[2931]: E0115 00:23:02.307597 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.307950 kubelet[2931]: W0115 00:23:02.307820 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.307950 kubelet[2931]: E0115 00:23:02.307893 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.308112 kubelet[2931]: E0115 00:23:02.308070 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.308112 kubelet[2931]: W0115 00:23:02.308083 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.308369 kubelet[2931]: E0115 00:23:02.308231 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.308569 kubelet[2931]: E0115 00:23:02.308550 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.308648 kubelet[2931]: W0115 00:23:02.308568 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.308702 kubelet[2931]: E0115 00:23:02.308666 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.308888 kubelet[2931]: E0115 00:23:02.308873 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.308923 kubelet[2931]: W0115 00:23:02.308888 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.308923 kubelet[2931]: E0115 00:23:02.308917 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.309078 kubelet[2931]: E0115 00:23:02.309066 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.309078 kubelet[2931]: W0115 00:23:02.309076 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.309134 kubelet[2931]: E0115 00:23:02.309100 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.309249 kubelet[2931]: E0115 00:23:02.309238 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.309249 kubelet[2931]: W0115 00:23:02.309249 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.309341 kubelet[2931]: E0115 00:23:02.309273 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.309718 kubelet[2931]: E0115 00:23:02.309690 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.309718 kubelet[2931]: W0115 00:23:02.309710 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.310397 kubelet[2931]: E0115 00:23:02.309737 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.310397 kubelet[2931]: E0115 00:23:02.309910 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.310397 kubelet[2931]: W0115 00:23:02.309920 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.310397 kubelet[2931]: E0115 00:23:02.309943 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.310397 kubelet[2931]: E0115 00:23:02.310103 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.310397 kubelet[2931]: W0115 00:23:02.310113 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.310397 kubelet[2931]: E0115 00:23:02.310222 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.310397 kubelet[2931]: E0115 00:23:02.310354 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.310397 kubelet[2931]: W0115 00:23:02.310365 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.310397 kubelet[2931]: E0115 00:23:02.310378 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.310602 kubelet[2931]: E0115 00:23:02.310532 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.310602 kubelet[2931]: W0115 00:23:02.310540 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.310602 kubelet[2931]: E0115 00:23:02.310550 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.311078 kubelet[2931]: E0115 00:23:02.311050 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.311078 kubelet[2931]: W0115 00:23:02.311071 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.311185 kubelet[2931]: E0115 00:23:02.311099 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.311248 kubelet[2931]: E0115 00:23:02.311235 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.311248 kubelet[2931]: W0115 00:23:02.311245 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.311355 kubelet[2931]: E0115 00:23:02.311278 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.311447 kubelet[2931]: E0115 00:23:02.311370 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.311447 kubelet[2931]: W0115 00:23:02.311377 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.311447 kubelet[2931]: E0115 00:23:02.311407 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.311538 kubelet[2931]: E0115 00:23:02.311523 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.311538 kubelet[2931]: W0115 00:23:02.311533 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.311606 kubelet[2931]: E0115 00:23:02.311589 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.311697 kubelet[2931]: E0115 00:23:02.311686 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.311697 kubelet[2931]: W0115 00:23:02.311696 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.311745 kubelet[2931]: E0115 00:23:02.311710 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.311979 kubelet[2931]: E0115 00:23:02.311965 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.312010 kubelet[2931]: W0115 00:23:02.311979 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.312010 kubelet[2931]: E0115 00:23:02.311999 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.312209 kubelet[2931]: E0115 00:23:02.312194 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.312209 kubelet[2931]: W0115 00:23:02.312207 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.312273 kubelet[2931]: E0115 00:23:02.312216 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:02.322267 kubelet[2931]: E0115 00:23:02.322241 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:02.322267 kubelet[2931]: W0115 00:23:02.322263 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:02.322356 kubelet[2931]: E0115 00:23:02.322281 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:02.719000 audit[3521]: NETFILTER_CFG table=filter:117 family=2 entries=22 op=nft_register_rule pid=3521 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:02.719000 audit[3521]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffedd70e20 a2=0 a3=1 items=0 ppid=3036 pid=3521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.719000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:02.725000 audit[3521]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=3521 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:02.725000 audit[3521]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffedd70e20 a2=0 a3=1 items=0 ppid=3036 pid=3521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:02.725000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:03.591941 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3363472580.mount: Deactivated successfully. Jan 15 00:23:03.872111 kubelet[2931]: E0115 00:23:03.871959 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:05.872352 kubelet[2931]: E0115 00:23:05.872252 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:07.872185 kubelet[2931]: E0115 00:23:07.872109 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:09.872532 kubelet[2931]: E0115 00:23:09.872404 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:11.872067 kubelet[2931]: E0115 00:23:11.871998 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:13.872273 kubelet[2931]: E0115 00:23:13.872221 2931 pod_workers.go:1301] "Error syncing pod, skipping" 
err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:15.871910 kubelet[2931]: E0115 00:23:15.871849 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:17.872311 kubelet[2931]: E0115 00:23:17.872156 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:18.211705 containerd[1703]: time="2026-01-15T00:23:18.211496599Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:23:18.213304 containerd[1703]: time="2026-01-15T00:23:18.213257485Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=31716861" Jan 15 00:23:18.214497 containerd[1703]: time="2026-01-15T00:23:18.214470528Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:23:18.216688 containerd[1703]: time="2026-01-15T00:23:18.216656095Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:23:18.217207 containerd[1703]: time="2026-01-15T00:23:18.217164137Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 16.060463324s" Jan 15 00:23:18.217264 containerd[1703]: time="2026-01-15T00:23:18.217209497Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Jan 15 00:23:18.218475 containerd[1703]: time="2026-01-15T00:23:18.218297820Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 15 00:23:18.228308 containerd[1703]: time="2026-01-15T00:23:18.228236530Z" level=info msg="CreateContainer within sandbox \"5d26872dffc67f8a3885c2f6f881025649956d123c9f56e5f65f74713df04853\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 15 00:23:18.239406 containerd[1703]: time="2026-01-15T00:23:18.239359124Z" level=info msg="Container ec0a98d515673948bef6d6782c87624d365ea7ad3486de8b7b92440eaf6b64dc: CDI devices from CRI Config.CDIDevices: []" Jan 15 00:23:18.248038 containerd[1703]: time="2026-01-15T00:23:18.247973111Z" level=info msg="CreateContainer within sandbox \"5d26872dffc67f8a3885c2f6f881025649956d123c9f56e5f65f74713df04853\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} 
returns container id \"ec0a98d515673948bef6d6782c87624d365ea7ad3486de8b7b92440eaf6b64dc\"" Jan 15 00:23:18.249004 containerd[1703]: time="2026-01-15T00:23:18.248975674Z" level=info msg="StartContainer for \"ec0a98d515673948bef6d6782c87624d365ea7ad3486de8b7b92440eaf6b64dc\"" Jan 15 00:23:18.250234 containerd[1703]: time="2026-01-15T00:23:18.250202918Z" level=info msg="connecting to shim ec0a98d515673948bef6d6782c87624d365ea7ad3486de8b7b92440eaf6b64dc" address="unix:///run/containerd/s/870ced865431b981afb963f57a300398876feae5f2aa509490bec17efc3cc78c" protocol=ttrpc version=3 Jan 15 00:23:18.270355 systemd[1]: Started cri-containerd-ec0a98d515673948bef6d6782c87624d365ea7ad3486de8b7b92440eaf6b64dc.scope - libcontainer container ec0a98d515673948bef6d6782c87624d365ea7ad3486de8b7b92440eaf6b64dc. Jan 15 00:23:18.281000 audit: BPF prog-id=161 op=LOAD Jan 15 00:23:18.282803 kernel: kauditd_printk_skb: 64 callbacks suppressed Jan 15 00:23:18.282876 kernel: audit: type=1334 audit(1768436598.281:562): prog-id=161 op=LOAD Jan 15 00:23:18.282900 kernel: audit: type=1334 audit(1768436598.281:563): prog-id=162 op=LOAD Jan 15 00:23:18.281000 audit: BPF prog-id=162 op=LOAD Jan 15 00:23:18.281000 audit[3537]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3351 pid=3537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:18.287159 kernel: audit: type=1300 audit(1768436598.281:563): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3351 pid=3537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:18.287223 kernel: audit: type=1327 audit(1768436598.281:563): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563306139386435313536373339343862656636643637383263383736 Jan 15 00:23:18.281000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563306139386435313536373339343862656636643637383263383736 Jan 15 00:23:18.281000 audit: BPF prog-id=162 op=UNLOAD Jan 15 00:23:18.291371 kernel: audit: type=1334 audit(1768436598.281:564): prog-id=162 op=UNLOAD Jan 15 00:23:18.291414 kernel: audit: type=1300 audit(1768436598.281:564): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3351 pid=3537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:18.281000 audit[3537]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3351 pid=3537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:18.281000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563306139386435313536373339343862656636643637383263383736 Jan 15 00:23:18.298033 kernel: audit: type=1327 audit(1768436598.281:564): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563306139386435313536373339343862656636643637383263383736 Jan 15 00:23:18.298099 kernel: audit: type=1334 audit(1768436598.281:565): prog-id=163 op=LOAD Jan 15 00:23:18.281000 audit: BPF prog-id=163 op=LOAD Jan 15 00:23:18.281000 audit[3537]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3351 pid=3537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:18.302585 kernel: audit: type=1300 audit(1768436598.281:565): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3351 pid=3537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:18.302666 kernel: audit: type=1327 audit(1768436598.281:565): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563306139386435313536373339343862656636643637383263383736 Jan 15 00:23:18.281000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563306139386435313536373339343862656636643637383263383736 Jan 15 00:23:18.282000 audit: BPF prog-id=164 op=LOAD Jan 15 00:23:18.282000 audit[3537]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3351 pid=3537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:18.282000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563306139386435313536373339343862656636643637383263383736 Jan 15 00:23:18.283000 audit: BPF prog-id=164 op=UNLOAD Jan 15 00:23:18.283000 audit[3537]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3351 pid=3537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:18.283000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563306139386435313536373339343862656636643637383263383736 Jan 15 00:23:18.283000 audit: BPF prog-id=163 op=UNLOAD Jan 15 00:23:18.283000 audit[3537]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3351 pid=3537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:18.283000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563306139386435313536373339343862656636643637383263383736 Jan 15 00:23:18.283000 audit: BPF prog-id=165 op=LOAD Jan 15 00:23:18.283000 audit[3537]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3351 pid=3537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:18.283000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563306139386435313536373339343862656636643637383263383736 Jan 15 00:23:18.328328 containerd[1703]: time="2026-01-15T00:23:18.328284476Z" level=info msg="StartContainer for \"ec0a98d515673948bef6d6782c87624d365ea7ad3486de8b7b92440eaf6b64dc\" returns successfully" Jan 15 00:23:18.998409 kubelet[2931]: E0115 00:23:18.998359 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:18.998409 kubelet[2931]: W0115 00:23:18.998393 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:18.998409 kubelet[2931]: E0115 00:23:18.998413 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:18.998827 kubelet[2931]: E0115 00:23:18.998555 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:18.998827 kubelet[2931]: W0115 00:23:18.998563 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:18.998827 kubelet[2931]: E0115 00:23:18.998595 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:18.998827 kubelet[2931]: E0115 00:23:18.998728 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:18.998827 kubelet[2931]: W0115 00:23:18.998735 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:18.998827 kubelet[2931]: E0115 00:23:18.998743 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:18.998954 kubelet[2931]: E0115 00:23:18.998869 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:18.998954 kubelet[2931]: W0115 00:23:18.998876 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:18.998954 kubelet[2931]: E0115 00:23:18.998884 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:18.999046 kubelet[2931]: E0115 00:23:18.999020 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:18.999046 kubelet[2931]: W0115 00:23:18.999037 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:18.999089 kubelet[2931]: E0115 00:23:18.999045 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:18.999192 kubelet[2931]: E0115 00:23:18.999165 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:18.999226 kubelet[2931]: W0115 00:23:18.999195 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:18.999226 kubelet[2931]: E0115 00:23:18.999203 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:18.999335 kubelet[2931]: E0115 00:23:18.999325 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:18.999360 kubelet[2931]: W0115 00:23:18.999334 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:18.999360 kubelet[2931]: E0115 00:23:18.999342 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:18.999474 kubelet[2931]: E0115 00:23:18.999464 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:18.999501 kubelet[2931]: W0115 00:23:18.999477 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:18.999501 kubelet[2931]: E0115 00:23:18.999485 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:18.999662 kubelet[2931]: E0115 00:23:18.999650 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:18.999662 kubelet[2931]: W0115 00:23:18.999660 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:18.999714 kubelet[2931]: E0115 00:23:18.999668 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:18.999796 kubelet[2931]: E0115 00:23:18.999784 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:18.999828 kubelet[2931]: W0115 00:23:18.999797 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:18.999828 kubelet[2931]: E0115 00:23:18.999806 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:18.999939 kubelet[2931]: E0115 00:23:18.999929 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:18.999962 kubelet[2931]: W0115 00:23:18.999939 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:18.999962 kubelet[2931]: E0115 00:23:18.999946 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.000071 kubelet[2931]: E0115 00:23:19.000062 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.000099 kubelet[2931]: W0115 00:23:19.000071 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.000099 kubelet[2931]: E0115 00:23:19.000078 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.000233 kubelet[2931]: E0115 00:23:19.000223 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.000265 kubelet[2931]: W0115 00:23:19.000232 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.000265 kubelet[2931]: E0115 00:23:19.000240 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:19.000370 kubelet[2931]: E0115 00:23:19.000359 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.000397 kubelet[2931]: W0115 00:23:19.000369 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.000397 kubelet[2931]: E0115 00:23:19.000382 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.000530 kubelet[2931]: E0115 00:23:19.000519 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.000530 kubelet[2931]: W0115 00:23:19.000528 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.000581 kubelet[2931]: E0115 00:23:19.000535 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.026049 kubelet[2931]: E0115 00:23:19.025978 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.026049 kubelet[2931]: W0115 00:23:19.026036 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.026189 kubelet[2931]: E0115 00:23:19.026058 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.026330 kubelet[2931]: E0115 00:23:19.026303 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.026330 kubelet[2931]: W0115 00:23:19.026319 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.026383 kubelet[2931]: E0115 00:23:19.026335 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.026556 kubelet[2931]: E0115 00:23:19.026530 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.026556 kubelet[2931]: W0115 00:23:19.026543 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.026617 kubelet[2931]: E0115 00:23:19.026559 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:19.026815 kubelet[2931]: E0115 00:23:19.026772 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.026815 kubelet[2931]: W0115 00:23:19.026784 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.026815 kubelet[2931]: E0115 00:23:19.026800 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.026946 kubelet[2931]: E0115 00:23:19.026934 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.026946 kubelet[2931]: W0115 00:23:19.026944 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.027004 kubelet[2931]: E0115 00:23:19.026957 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.027102 kubelet[2931]: E0115 00:23:19.027091 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.027126 kubelet[2931]: W0115 00:23:19.027103 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.027126 kubelet[2931]: E0115 00:23:19.027116 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.027306 kubelet[2931]: E0115 00:23:19.027293 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.027306 kubelet[2931]: W0115 00:23:19.027304 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.027370 kubelet[2931]: E0115 00:23:19.027327 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.027450 kubelet[2931]: E0115 00:23:19.027437 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.027450 kubelet[2931]: W0115 00:23:19.027447 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.027514 kubelet[2931]: E0115 00:23:19.027491 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:19.027587 kubelet[2931]: E0115 00:23:19.027577 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.027615 kubelet[2931]: W0115 00:23:19.027587 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.027615 kubelet[2931]: E0115 00:23:19.027601 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.027756 kubelet[2931]: E0115 00:23:19.027744 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.027756 kubelet[2931]: W0115 00:23:19.027755 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.027813 kubelet[2931]: E0115 00:23:19.027768 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.027908 kubelet[2931]: E0115 00:23:19.027897 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.027908 kubelet[2931]: W0115 00:23:19.027906 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.027958 kubelet[2931]: E0115 00:23:19.027919 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.028139 kubelet[2931]: E0115 00:23:19.028124 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.028139 kubelet[2931]: W0115 00:23:19.028138 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.028214 kubelet[2931]: E0115 00:23:19.028157 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.028426 kubelet[2931]: E0115 00:23:19.028407 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.028426 kubelet[2931]: W0115 00:23:19.028425 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.028483 kubelet[2931]: E0115 00:23:19.028443 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:19.028607 kubelet[2931]: E0115 00:23:19.028596 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.028646 kubelet[2931]: W0115 00:23:19.028609 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.028646 kubelet[2931]: E0115 00:23:19.028634 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.028779 kubelet[2931]: E0115 00:23:19.028767 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.028779 kubelet[2931]: W0115 00:23:19.028777 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.028837 kubelet[2931]: E0115 00:23:19.028790 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.028960 kubelet[2931]: E0115 00:23:19.028948 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.028960 kubelet[2931]: W0115 00:23:19.028959 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.029006 kubelet[2931]: E0115 00:23:19.028971 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.029219 kubelet[2931]: E0115 00:23:19.029203 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.029255 kubelet[2931]: W0115 00:23:19.029220 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.029255 kubelet[2931]: E0115 00:23:19.029232 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:19.029450 kubelet[2931]: E0115 00:23:19.029435 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:19.029475 kubelet[2931]: W0115 00:23:19.029450 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:19.029475 kubelet[2931]: E0115 00:23:19.029460 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:19.359094 kubelet[2931]: I0115 00:23:19.359013 2931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-8685c9dd54-sf7ft" podStartSLOduration=2.29701878 podStartE2EDuration="18.358968509s" podCreationTimestamp="2026-01-15 00:23:01 +0000 UTC" firstStartedPulling="2026-01-15 00:23:02.156197811 +0000 UTC m=+23.373781158" lastFinishedPulling="2026-01-15 00:23:18.21814754 +0000 UTC m=+39.435730887" observedRunningTime="2026-01-15 00:23:18.995329837 +0000 UTC m=+40.212913224" watchObservedRunningTime="2026-01-15 00:23:19.358968509 +0000 UTC m=+40.576551856" Jan 15 00:23:19.368000 audit[3615]: NETFILTER_CFG table=filter:119 family=2 entries=21 op=nft_register_rule pid=3615 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:19.368000 audit[3615]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc28691a0 a2=0 a3=1 items=0 ppid=3036 pid=3615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:19.368000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:19.378000 audit[3615]: NETFILTER_CFG table=nat:120 family=2 entries=19 op=nft_register_chain pid=3615 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:19.378000 audit[3615]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffc28691a0 a2=0 a3=1 items=0 ppid=3036 pid=3615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:19.378000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:19.872412 kubelet[2931]: E0115 00:23:19.872351 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:20.007938 kubelet[2931]: E0115 00:23:20.007808 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.007938 kubelet[2931]: W0115 00:23:20.007835 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.007938 kubelet[2931]: E0115 00:23:20.007857 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:20.008525 kubelet[2931]: E0115 00:23:20.008390 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.008525 kubelet[2931]: W0115 00:23:20.008404 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.008525 kubelet[2931]: E0115 00:23:20.008417 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.008716 kubelet[2931]: E0115 00:23:20.008701 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.008774 kubelet[2931]: W0115 00:23:20.008763 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.008830 kubelet[2931]: E0115 00:23:20.008818 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.009160 kubelet[2931]: E0115 00:23:20.009044 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.009160 kubelet[2931]: W0115 00:23:20.009057 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.009160 kubelet[2931]: E0115 00:23:20.009070 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.009428 kubelet[2931]: E0115 00:23:20.009413 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.009483 kubelet[2931]: W0115 00:23:20.009472 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.009537 kubelet[2931]: E0115 00:23:20.009527 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.009834 kubelet[2931]: E0115 00:23:20.009740 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.009834 kubelet[2931]: W0115 00:23:20.009752 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.009834 kubelet[2931]: E0115 00:23:20.009761 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:20.009996 kubelet[2931]: E0115 00:23:20.009983 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.010049 kubelet[2931]: W0115 00:23:20.010039 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.010107 kubelet[2931]: E0115 00:23:20.010096 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.010433 kubelet[2931]: E0115 00:23:20.010326 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.010433 kubelet[2931]: W0115 00:23:20.010338 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.010433 kubelet[2931]: E0115 00:23:20.010348 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.010597 kubelet[2931]: E0115 00:23:20.010584 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.010652 kubelet[2931]: W0115 00:23:20.010642 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.010705 kubelet[2931]: E0115 00:23:20.010694 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.010904 kubelet[2931]: E0115 00:23:20.010891 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.010981 kubelet[2931]: W0115 00:23:20.010967 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.011036 kubelet[2931]: E0115 00:23:20.011024 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.011333 kubelet[2931]: E0115 00:23:20.011238 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.011333 kubelet[2931]: W0115 00:23:20.011249 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.011333 kubelet[2931]: E0115 00:23:20.011259 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:20.011491 kubelet[2931]: E0115 00:23:20.011479 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.011552 kubelet[2931]: W0115 00:23:20.011539 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.011604 kubelet[2931]: E0115 00:23:20.011594 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.011917 kubelet[2931]: E0115 00:23:20.011816 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.011917 kubelet[2931]: W0115 00:23:20.011827 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.011917 kubelet[2931]: E0115 00:23:20.011837 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.012080 kubelet[2931]: E0115 00:23:20.012068 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.012133 kubelet[2931]: W0115 00:23:20.012123 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.012195 kubelet[2931]: E0115 00:23:20.012184 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.012530 kubelet[2931]: E0115 00:23:20.012436 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.012530 kubelet[2931]: W0115 00:23:20.012449 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.012530 kubelet[2931]: E0115 00:23:20.012462 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.033745 kubelet[2931]: E0115 00:23:20.033713 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.033745 kubelet[2931]: W0115 00:23:20.033737 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.033745 kubelet[2931]: E0115 00:23:20.033756 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:20.034094 kubelet[2931]: E0115 00:23:20.033930 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.034094 kubelet[2931]: W0115 00:23:20.033938 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.034094 kubelet[2931]: E0115 00:23:20.033947 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.034170 kubelet[2931]: E0115 00:23:20.034116 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.034170 kubelet[2931]: W0115 00:23:20.034124 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.034170 kubelet[2931]: E0115 00:23:20.034138 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.034502 kubelet[2931]: E0115 00:23:20.034479 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.034502 kubelet[2931]: W0115 00:23:20.034496 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.034502 kubelet[2931]: E0115 00:23:20.034512 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.034766 kubelet[2931]: E0115 00:23:20.034692 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.034766 kubelet[2931]: W0115 00:23:20.034702 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.034766 kubelet[2931]: E0115 00:23:20.034716 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.035350 kubelet[2931]: E0115 00:23:20.035334 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.035350 kubelet[2931]: W0115 00:23:20.035347 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.035462 kubelet[2931]: E0115 00:23:20.035362 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:20.035603 kubelet[2931]: E0115 00:23:20.035543 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.035632 kubelet[2931]: W0115 00:23:20.035605 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.035663 kubelet[2931]: E0115 00:23:20.035638 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.035794 kubelet[2931]: E0115 00:23:20.035783 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.035822 kubelet[2931]: W0115 00:23:20.035795 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.035822 kubelet[2931]: E0115 00:23:20.035817 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.035954 kubelet[2931]: E0115 00:23:20.035943 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.035987 kubelet[2931]: W0115 00:23:20.035954 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.035987 kubelet[2931]: E0115 00:23:20.035969 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.036248 kubelet[2931]: E0115 00:23:20.036232 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.036294 kubelet[2931]: W0115 00:23:20.036247 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.036294 kubelet[2931]: E0115 00:23:20.036266 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.036439 kubelet[2931]: E0115 00:23:20.036425 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.036439 kubelet[2931]: W0115 00:23:20.036437 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.036652 kubelet[2931]: E0115 00:23:20.036452 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:20.036817 kubelet[2931]: E0115 00:23:20.036764 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.036942 kubelet[2931]: W0115 00:23:20.036872 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.036942 kubelet[2931]: E0115 00:23:20.036903 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.037089 kubelet[2931]: E0115 00:23:20.037070 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.037089 kubelet[2931]: W0115 00:23:20.037085 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.037144 kubelet[2931]: E0115 00:23:20.037101 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.037296 kubelet[2931]: E0115 00:23:20.037282 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.037296 kubelet[2931]: W0115 00:23:20.037295 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.037365 kubelet[2931]: E0115 00:23:20.037311 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.038226 kubelet[2931]: E0115 00:23:20.038209 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.038277 kubelet[2931]: W0115 00:23:20.038226 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.038277 kubelet[2931]: E0115 00:23:20.038247 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.038637 kubelet[2931]: E0115 00:23:20.038616 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.038637 kubelet[2931]: W0115 00:23:20.038633 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.038722 kubelet[2931]: E0115 00:23:20.038653 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 00:23:20.039047 kubelet[2931]: E0115 00:23:20.039013 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.039047 kubelet[2931]: W0115 00:23:20.039029 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.039047 kubelet[2931]: E0115 00:23:20.039048 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.039313 kubelet[2931]: E0115 00:23:20.039291 2931 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 00:23:20.039313 kubelet[2931]: W0115 00:23:20.039307 2931 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 00:23:20.039407 kubelet[2931]: E0115 00:23:20.039321 2931 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 00:23:20.072080 containerd[1703]: time="2026-01-15T00:23:20.071582769Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:23:20.073391 containerd[1703]: time="2026-01-15T00:23:20.073327214Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4262566" Jan 15 00:23:20.074609 containerd[1703]: time="2026-01-15T00:23:20.074574498Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:23:20.077529 containerd[1703]: time="2026-01-15T00:23:20.077489267Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:23:20.078112 containerd[1703]: time="2026-01-15T00:23:20.078071428Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.859743688s" Jan 15 00:23:20.078112 containerd[1703]: time="2026-01-15T00:23:20.078105509Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Jan 15 00:23:20.081370 containerd[1703]: time="2026-01-15T00:23:20.081337158Z" level=info msg="CreateContainer within sandbox \"d6ae2bcad0acad21822d5e685815846b96808ca2fac64e63071a36e05055577e\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 15 00:23:20.091962 containerd[1703]: time="2026-01-15T00:23:20.091905831Z" level=info msg="Container 97d2b1c80b745b4488e0a3c6190fa7160c5950f64e5b156b8bbfcc73533d151a: 
CDI devices from CRI Config.CDIDevices: []" Jan 15 00:23:20.103299 containerd[1703]: time="2026-01-15T00:23:20.103254865Z" level=info msg="CreateContainer within sandbox \"d6ae2bcad0acad21822d5e685815846b96808ca2fac64e63071a36e05055577e\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"97d2b1c80b745b4488e0a3c6190fa7160c5950f64e5b156b8bbfcc73533d151a\"" Jan 15 00:23:20.104147 containerd[1703]: time="2026-01-15T00:23:20.104119908Z" level=info msg="StartContainer for \"97d2b1c80b745b4488e0a3c6190fa7160c5950f64e5b156b8bbfcc73533d151a\"" Jan 15 00:23:20.106537 containerd[1703]: time="2026-01-15T00:23:20.106489595Z" level=info msg="connecting to shim 97d2b1c80b745b4488e0a3c6190fa7160c5950f64e5b156b8bbfcc73533d151a" address="unix:///run/containerd/s/02da95c7790ec6d92d44d4add68da87d8f7286b5aa506bb12dc72ccb99e74722" protocol=ttrpc version=3 Jan 15 00:23:20.130442 systemd[1]: Started cri-containerd-97d2b1c80b745b4488e0a3c6190fa7160c5950f64e5b156b8bbfcc73533d151a.scope - libcontainer container 97d2b1c80b745b4488e0a3c6190fa7160c5950f64e5b156b8bbfcc73533d151a. Jan 15 00:23:20.203000 audit: BPF prog-id=166 op=LOAD Jan 15 00:23:20.203000 audit[3653]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=3456 pid=3653 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:20.203000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937643262316338306237343562343438386530613363363139306661 Jan 15 00:23:20.203000 audit: BPF prog-id=167 op=LOAD Jan 15 00:23:20.203000 audit[3653]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=3456 pid=3653 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:20.203000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937643262316338306237343562343438386530613363363139306661 Jan 15 00:23:20.203000 audit: BPF prog-id=167 op=UNLOAD Jan 15 00:23:20.203000 audit[3653]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3456 pid=3653 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:20.203000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937643262316338306237343562343438386530613363363139306661 Jan 15 00:23:20.203000 audit: BPF prog-id=166 op=UNLOAD Jan 15 00:23:20.203000 audit[3653]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3456 pid=3653 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:20.203000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937643262316338306237343562343438386530613363363139306661 Jan 15 00:23:20.203000 audit: BPF prog-id=168 op=LOAD Jan 15 00:23:20.203000 audit[3653]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=3456 pid=3653 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:20.203000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937643262316338306237343562343438386530613363363139306661 Jan 15 00:23:20.222260 containerd[1703]: time="2026-01-15T00:23:20.221464187Z" level=info msg="StartContainer for \"97d2b1c80b745b4488e0a3c6190fa7160c5950f64e5b156b8bbfcc73533d151a\" returns successfully" Jan 15 00:23:20.237043 systemd[1]: cri-containerd-97d2b1c80b745b4488e0a3c6190fa7160c5950f64e5b156b8bbfcc73533d151a.scope: Deactivated successfully. Jan 15 00:23:20.239423 containerd[1703]: time="2026-01-15T00:23:20.239376282Z" level=info msg="received container exit event container_id:\"97d2b1c80b745b4488e0a3c6190fa7160c5950f64e5b156b8bbfcc73533d151a\" id:\"97d2b1c80b745b4488e0a3c6190fa7160c5950f64e5b156b8bbfcc73533d151a\" pid:3665 exited_at:{seconds:1768436600 nanos:238946921}" Jan 15 00:23:20.241000 audit: BPF prog-id=168 op=UNLOAD Jan 15 00:23:20.260135 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-97d2b1c80b745b4488e0a3c6190fa7160c5950f64e5b156b8bbfcc73533d151a-rootfs.mount: Deactivated successfully. 
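The driver-call.go and plugins.go errors earlier in this excerpt come from the kubelet's FlexVolume probe: it execs each binary under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ with the argument init and parses a JSON status from stdout, so the missing nodeagent~uds/uds executable produces empty output and the "unexpected end of JSON input" unmarshal failure. A minimal sketch of that contract in Go, purely illustrative and not the actual Calico uds driver (the struct and field names are assumptions based on the usual FlexVolume status-plus-capabilities shape):

    package main

    import (
        "encoding/json"
        "fmt"
        "os"
    )

    // driverStatus mirrors the kind of JSON a FlexVolume binary is expected to
    // print on stdout; an empty reply is what produces the unmarshal error above.
    type driverStatus struct {
        Status       string          `json:"status"`
        Message      string          `json:"message,omitempty"`
        Capabilities map[string]bool `json:"capabilities,omitempty"`
    }

    func main() {
        resp := driverStatus{Status: "Not supported"}
        if len(os.Args) > 1 && os.Args[1] == "init" {
            // Report success and advertise no attach support, the common shape
            // for node-local FlexVolume drivers.
            resp = driverStatus{
                Status:       "Success",
                Capabilities: map[string]bool{"attach": false},
            }
        }
        out, err := json.Marshal(resp)
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        fmt.Println(string(out))
    }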
Jan 15 00:23:21.871526 kubelet[2931]: E0115 00:23:21.871471 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:21.990558 containerd[1703]: time="2026-01-15T00:23:21.990496878Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 15 00:23:23.871786 kubelet[2931]: E0115 00:23:23.871713 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:25.199323 containerd[1703]: time="2026-01-15T00:23:25.199091292Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:23:25.199869 containerd[1703]: time="2026-01-15T00:23:25.199824134Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248" Jan 15 00:23:25.201512 containerd[1703]: time="2026-01-15T00:23:25.201429699Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:23:25.203758 containerd[1703]: time="2026-01-15T00:23:25.203690266Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:23:25.204731 containerd[1703]: time="2026-01-15T00:23:25.204298228Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 3.21375567s" Jan 15 00:23:25.204731 containerd[1703]: time="2026-01-15T00:23:25.204345228Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Jan 15 00:23:25.207925 containerd[1703]: time="2026-01-15T00:23:25.207863279Z" level=info msg="CreateContainer within sandbox \"d6ae2bcad0acad21822d5e685815846b96808ca2fac64e63071a36e05055577e\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 15 00:23:25.221439 containerd[1703]: time="2026-01-15T00:23:25.220933999Z" level=info msg="Container 690764bbfa7684e72300a3b6e1ef5e8108bc9089bd283608ef868c80b45b7c28: CDI devices from CRI Config.CDIDevices: []" Jan 15 00:23:25.234144 containerd[1703]: time="2026-01-15T00:23:25.234064759Z" level=info msg="CreateContainer within sandbox \"d6ae2bcad0acad21822d5e685815846b96808ca2fac64e63071a36e05055577e\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"690764bbfa7684e72300a3b6e1ef5e8108bc9089bd283608ef868c80b45b7c28\"" Jan 15 00:23:25.234811 containerd[1703]: time="2026-01-15T00:23:25.234779281Z" level=info msg="StartContainer for \"690764bbfa7684e72300a3b6e1ef5e8108bc9089bd283608ef868c80b45b7c28\"" Jan 15 00:23:25.237636 
containerd[1703]: time="2026-01-15T00:23:25.237605650Z" level=info msg="connecting to shim 690764bbfa7684e72300a3b6e1ef5e8108bc9089bd283608ef868c80b45b7c28" address="unix:///run/containerd/s/02da95c7790ec6d92d44d4add68da87d8f7286b5aa506bb12dc72ccb99e74722" protocol=ttrpc version=3 Jan 15 00:23:25.263529 systemd[1]: Started cri-containerd-690764bbfa7684e72300a3b6e1ef5e8108bc9089bd283608ef868c80b45b7c28.scope - libcontainer container 690764bbfa7684e72300a3b6e1ef5e8108bc9089bd283608ef868c80b45b7c28. Jan 15 00:23:25.322000 audit: BPF prog-id=169 op=LOAD Jan 15 00:23:25.323689 kernel: kauditd_printk_skb: 34 callbacks suppressed Jan 15 00:23:25.323753 kernel: audit: type=1334 audit(1768436605.322:578): prog-id=169 op=LOAD Jan 15 00:23:25.322000 audit[3715]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=3456 pid=3715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:25.327903 kernel: audit: type=1300 audit(1768436605.322:578): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=3456 pid=3715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:25.327964 kernel: audit: type=1327 audit(1768436605.322:578): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639303736346262666137363834653732333030613362366531656635 Jan 15 00:23:25.322000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639303736346262666137363834653732333030613362366531656635 Jan 15 00:23:25.322000 audit: BPF prog-id=170 op=LOAD Jan 15 00:23:25.322000 audit[3715]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=3456 pid=3715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:25.335421 kernel: audit: type=1334 audit(1768436605.322:579): prog-id=170 op=LOAD Jan 15 00:23:25.335522 kernel: audit: type=1300 audit(1768436605.322:579): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=3456 pid=3715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:25.335585 kernel: audit: type=1327 audit(1768436605.322:579): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639303736346262666137363834653732333030613362366531656635 Jan 15 00:23:25.322000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639303736346262666137363834653732333030613362366531656635 Jan 15 00:23:25.323000 
audit: BPF prog-id=170 op=UNLOAD Jan 15 00:23:25.339686 kernel: audit: type=1334 audit(1768436605.323:580): prog-id=170 op=UNLOAD Jan 15 00:23:25.339731 kernel: audit: type=1300 audit(1768436605.323:580): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3456 pid=3715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:25.323000 audit[3715]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3456 pid=3715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:25.323000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639303736346262666137363834653732333030613362366531656635 Jan 15 00:23:25.346706 kernel: audit: type=1327 audit(1768436605.323:580): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639303736346262666137363834653732333030613362366531656635 Jan 15 00:23:25.323000 audit: BPF prog-id=169 op=UNLOAD Jan 15 00:23:25.347876 kernel: audit: type=1334 audit(1768436605.323:581): prog-id=169 op=UNLOAD Jan 15 00:23:25.323000 audit[3715]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3456 pid=3715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:25.323000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639303736346262666137363834653732333030613362366531656635 Jan 15 00:23:25.323000 audit: BPF prog-id=171 op=LOAD Jan 15 00:23:25.323000 audit[3715]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=3456 pid=3715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:25.323000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639303736346262666137363834653732333030613362366531656635 Jan 15 00:23:25.364472 containerd[1703]: time="2026-01-15T00:23:25.364433038Z" level=info msg="StartContainer for \"690764bbfa7684e72300a3b6e1ef5e8108bc9089bd283608ef868c80b45b7c28\" returns successfully" Jan 15 00:23:25.872406 kubelet[2931]: E0115 00:23:25.872279 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:26.625868 systemd[1]: 
cri-containerd-690764bbfa7684e72300a3b6e1ef5e8108bc9089bd283608ef868c80b45b7c28.scope: Deactivated successfully. Jan 15 00:23:26.626224 systemd[1]: cri-containerd-690764bbfa7684e72300a3b6e1ef5e8108bc9089bd283608ef868c80b45b7c28.scope: Consumed 478ms CPU time, 189.3M memory peak, 165.9M written to disk. Jan 15 00:23:26.627529 containerd[1703]: time="2026-01-15T00:23:26.627478981Z" level=info msg="received container exit event container_id:\"690764bbfa7684e72300a3b6e1ef5e8108bc9089bd283608ef868c80b45b7c28\" id:\"690764bbfa7684e72300a3b6e1ef5e8108bc9089bd283608ef868c80b45b7c28\" pid:3728 exited_at:{seconds:1768436606 nanos:627038140}" Jan 15 00:23:26.629000 audit: BPF prog-id=171 op=UNLOAD Jan 15 00:23:26.648886 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-690764bbfa7684e72300a3b6e1ef5e8108bc9089bd283608ef868c80b45b7c28-rootfs.mount: Deactivated successfully. Jan 15 00:23:26.709871 kubelet[2931]: I0115 00:23:26.709830 2931 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 15 00:23:27.328519 kubelet[2931]: I0115 00:23:26.781670 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkx7m\" (UniqueName: \"kubernetes.io/projected/9b5a7ef8-11b0-402a-a15b-6cd8553f5062-kube-api-access-bkx7m\") pod \"whisker-7585bf5dbc-745zl\" (UID: \"9b5a7ef8-11b0-402a-a15b-6cd8553f5062\") " pod="calico-system/whisker-7585bf5dbc-745zl" Jan 15 00:23:27.328519 kubelet[2931]: I0115 00:23:26.781700 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pkgc\" (UniqueName: \"kubernetes.io/projected/cacf3b5e-3a3d-4bb1-94ee-791570a7ddfd-kube-api-access-7pkgc\") pod \"coredns-668d6bf9bc-ksjks\" (UID: \"cacf3b5e-3a3d-4bb1-94ee-791570a7ddfd\") " pod="kube-system/coredns-668d6bf9bc-ksjks" Jan 15 00:23:27.328519 kubelet[2931]: I0115 00:23:26.781790 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/933e7fe5-e25e-48cf-938a-716b1fa3d838-calico-apiserver-certs\") pod \"calico-apiserver-d7bdbd78b-nn9k9\" (UID: \"933e7fe5-e25e-48cf-938a-716b1fa3d838\") " pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" Jan 15 00:23:27.328519 kubelet[2931]: I0115 00:23:26.781825 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/576324d0-4c45-424a-9f35-c0de23b9b1ac-config\") pod \"goldmane-666569f655-cd46w\" (UID: \"576324d0-4c45-424a-9f35-c0de23b9b1ac\") " pod="calico-system/goldmane-666569f655-cd46w" Jan 15 00:23:27.328519 kubelet[2931]: I0115 00:23:26.781862 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/051b417e-bac4-4f72-8b07-3775d126567f-tigera-ca-bundle\") pod \"calico-kube-controllers-7877d5fb5-c885v\" (UID: \"051b417e-bac4-4f72-8b07-3775d126567f\") " pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" Jan 15 00:23:26.746731 systemd[1]: Created slice kubepods-burstable-podd39e4a7d_cae8_4f75_a430_aaeae5982278.slice - libcontainer container kubepods-burstable-podd39e4a7d_cae8_4f75_a430_aaeae5982278.slice. 
Jan 15 00:23:27.329008 kubelet[2931]: I0115 00:23:26.781878 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xck2k\" (UniqueName: \"kubernetes.io/projected/933e7fe5-e25e-48cf-938a-716b1fa3d838-kube-api-access-xck2k\") pod \"calico-apiserver-d7bdbd78b-nn9k9\" (UID: \"933e7fe5-e25e-48cf-938a-716b1fa3d838\") " pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" Jan 15 00:23:27.329008 kubelet[2931]: I0115 00:23:26.781895 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqmw8\" (UniqueName: \"kubernetes.io/projected/576324d0-4c45-424a-9f35-c0de23b9b1ac-kube-api-access-vqmw8\") pod \"goldmane-666569f655-cd46w\" (UID: \"576324d0-4c45-424a-9f35-c0de23b9b1ac\") " pod="calico-system/goldmane-666569f655-cd46w" Jan 15 00:23:27.329008 kubelet[2931]: I0115 00:23:26.781911 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-568wc\" (UniqueName: \"kubernetes.io/projected/051b417e-bac4-4f72-8b07-3775d126567f-kube-api-access-568wc\") pod \"calico-kube-controllers-7877d5fb5-c885v\" (UID: \"051b417e-bac4-4f72-8b07-3775d126567f\") " pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" Jan 15 00:23:27.329008 kubelet[2931]: I0115 00:23:26.781934 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5q7b\" (UniqueName: \"kubernetes.io/projected/782417a5-ecd0-40c5-85c0-45ead5d347fd-kube-api-access-h5q7b\") pod \"calico-apiserver-d7bdbd78b-v4vh7\" (UID: \"782417a5-ecd0-40c5-85c0-45ead5d347fd\") " pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" Jan 15 00:23:27.329008 kubelet[2931]: I0115 00:23:26.782036 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d39e4a7d-cae8-4f75-a430-aaeae5982278-config-volume\") pod \"coredns-668d6bf9bc-67zqw\" (UID: \"d39e4a7d-cae8-4f75-a430-aaeae5982278\") " pod="kube-system/coredns-668d6bf9bc-67zqw" Jan 15 00:23:26.757160 systemd[1]: Created slice kubepods-besteffort-pod9b5a7ef8_11b0_402a_a15b_6cd8553f5062.slice - libcontainer container kubepods-besteffort-pod9b5a7ef8_11b0_402a_a15b_6cd8553f5062.slice. 
Jan 15 00:23:27.329162 kubelet[2931]: I0115 00:23:26.782070 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9b5a7ef8-11b0-402a-a15b-6cd8553f5062-whisker-backend-key-pair\") pod \"whisker-7585bf5dbc-745zl\" (UID: \"9b5a7ef8-11b0-402a-a15b-6cd8553f5062\") " pod="calico-system/whisker-7585bf5dbc-745zl" Jan 15 00:23:27.329162 kubelet[2931]: I0115 00:23:26.782112 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b5a7ef8-11b0-402a-a15b-6cd8553f5062-whisker-ca-bundle\") pod \"whisker-7585bf5dbc-745zl\" (UID: \"9b5a7ef8-11b0-402a-a15b-6cd8553f5062\") " pod="calico-system/whisker-7585bf5dbc-745zl" Jan 15 00:23:27.329162 kubelet[2931]: I0115 00:23:26.782126 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cacf3b5e-3a3d-4bb1-94ee-791570a7ddfd-config-volume\") pod \"coredns-668d6bf9bc-ksjks\" (UID: \"cacf3b5e-3a3d-4bb1-94ee-791570a7ddfd\") " pod="kube-system/coredns-668d6bf9bc-ksjks" Jan 15 00:23:27.329162 kubelet[2931]: I0115 00:23:26.782141 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/576324d0-4c45-424a-9f35-c0de23b9b1ac-goldmane-key-pair\") pod \"goldmane-666569f655-cd46w\" (UID: \"576324d0-4c45-424a-9f35-c0de23b9b1ac\") " pod="calico-system/goldmane-666569f655-cd46w" Jan 15 00:23:27.329162 kubelet[2931]: I0115 00:23:26.782161 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r56wv\" (UniqueName: \"kubernetes.io/projected/d39e4a7d-cae8-4f75-a430-aaeae5982278-kube-api-access-r56wv\") pod \"coredns-668d6bf9bc-67zqw\" (UID: \"d39e4a7d-cae8-4f75-a430-aaeae5982278\") " pod="kube-system/coredns-668d6bf9bc-67zqw" Jan 15 00:23:26.764586 systemd[1]: Created slice kubepods-burstable-podcacf3b5e_3a3d_4bb1_94ee_791570a7ddfd.slice - libcontainer container kubepods-burstable-podcacf3b5e_3a3d_4bb1_94ee_791570a7ddfd.slice. Jan 15 00:23:27.330484 kubelet[2931]: I0115 00:23:26.782210 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/782417a5-ecd0-40c5-85c0-45ead5d347fd-calico-apiserver-certs\") pod \"calico-apiserver-d7bdbd78b-v4vh7\" (UID: \"782417a5-ecd0-40c5-85c0-45ead5d347fd\") " pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" Jan 15 00:23:27.330484 kubelet[2931]: I0115 00:23:26.782241 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/576324d0-4c45-424a-9f35-c0de23b9b1ac-goldmane-ca-bundle\") pod \"goldmane-666569f655-cd46w\" (UID: \"576324d0-4c45-424a-9f35-c0de23b9b1ac\") " pod="calico-system/goldmane-666569f655-cd46w" Jan 15 00:23:26.771088 systemd[1]: Created slice kubepods-besteffort-pod051b417e_bac4_4f72_8b07_3775d126567f.slice - libcontainer container kubepods-besteffort-pod051b417e_bac4_4f72_8b07_3775d126567f.slice. Jan 15 00:23:26.779373 systemd[1]: Created slice kubepods-besteffort-pod933e7fe5_e25e_48cf_938a_716b1fa3d838.slice - libcontainer container kubepods-besteffort-pod933e7fe5_e25e_48cf_938a_716b1fa3d838.slice. 
Jan 15 00:23:26.786834 systemd[1]: Created slice kubepods-besteffort-pod782417a5_ecd0_40c5_85c0_45ead5d347fd.slice - libcontainer container kubepods-besteffort-pod782417a5_ecd0_40c5_85c0_45ead5d347fd.slice. Jan 15 00:23:26.793075 systemd[1]: Created slice kubepods-besteffort-pod576324d0_4c45_424a_9f35_c0de23b9b1ac.slice - libcontainer container kubepods-besteffort-pod576324d0_4c45_424a_9f35_c0de23b9b1ac.slice. Jan 15 00:23:27.408655 containerd[1703]: time="2026-01-15T00:23:27.408489050Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d7bdbd78b-nn9k9,Uid:933e7fe5-e25e-48cf-938a-716b1fa3d838,Namespace:calico-apiserver,Attempt:0,}" Jan 15 00:23:27.630014 containerd[1703]: time="2026-01-15T00:23:27.629857927Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-67zqw,Uid:d39e4a7d-cae8-4f75-a430-aaeae5982278,Namespace:kube-system,Attempt:0,}" Jan 15 00:23:27.633134 containerd[1703]: time="2026-01-15T00:23:27.633085057Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d7bdbd78b-v4vh7,Uid:782417a5-ecd0-40c5-85c0-45ead5d347fd,Namespace:calico-apiserver,Attempt:0,}" Jan 15 00:23:27.703693 containerd[1703]: time="2026-01-15T00:23:27.703563152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7877d5fb5-c885v,Uid:051b417e-bac4-4f72-8b07-3775d126567f,Namespace:calico-system,Attempt:0,}" Jan 15 00:23:27.703693 containerd[1703]: time="2026-01-15T00:23:27.703650553Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7585bf5dbc-745zl,Uid:9b5a7ef8-11b0-402a-a15b-6cd8553f5062,Namespace:calico-system,Attempt:0,}" Jan 15 00:23:27.708187 containerd[1703]: time="2026-01-15T00:23:27.708081686Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ksjks,Uid:cacf3b5e-3a3d-4bb1-94ee-791570a7ddfd,Namespace:kube-system,Attempt:0,}" Jan 15 00:23:27.708187 containerd[1703]: time="2026-01-15T00:23:27.708143166Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-cd46w,Uid:576324d0-4c45-424a-9f35-c0de23b9b1ac,Namespace:calico-system,Attempt:0,}" Jan 15 00:23:27.876960 systemd[1]: Created slice kubepods-besteffort-pode8af8aef_db47_4bb0_9303_531f44a2593e.slice - libcontainer container kubepods-besteffort-pode8af8aef_db47_4bb0_9303_531f44a2593e.slice. 
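The "Created slice kubepods-..." entries above show how the kubelet's systemd cgroup driver names pod cgroups: the QoS class (besteffort, burstable) and the pod UID are folded into a slice name, with the UID's dashes escaped to underscores because "-" is systemd's slice hierarchy separator. A rough sketch of that mapping, using a hypothetical helper name and the UIDs that appear in this log:

    package main

    import (
        "fmt"
        "strings"
    )

    // sliceNameFor reproduces the naming visible in the log: pod UID
    // e8af8aef-db47-4bb0-9303-531f44a2593e in the besteffort class becomes
    // kubepods-besteffort-pode8af8aef_db47_4bb0_9303_531f44a2593e.slice.
    func sliceNameFor(qosClass, podUID string) string {
        escaped := strings.ReplaceAll(podUID, "-", "_")
        return fmt.Sprintf("kubepods-%s-pod%s.slice", qosClass, escaped)
    }

    func main() {
        fmt.Println(sliceNameFor("besteffort", "e8af8aef-db47-4bb0-9303-531f44a2593e"))
        fmt.Println(sliceNameFor("burstable", "d39e4a7d-cae8-4f75-a430-aaeae5982278"))
    }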
Jan 15 00:23:27.879475 containerd[1703]: time="2026-01-15T00:23:27.879418250Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-92nsn,Uid:e8af8aef-db47-4bb0-9303-531f44a2593e,Namespace:calico-system,Attempt:0,}" Jan 15 00:23:28.251981 containerd[1703]: time="2026-01-15T00:23:28.251928950Z" level=error msg="Failed to destroy network for sandbox \"3e4cb4a4bb61c49e8aba59191b9969f6c5743023ce4cde313f0834157ffebb14\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.256606 containerd[1703]: time="2026-01-15T00:23:28.256548684Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ksjks,Uid:cacf3b5e-3a3d-4bb1-94ee-791570a7ddfd,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e4cb4a4bb61c49e8aba59191b9969f6c5743023ce4cde313f0834157ffebb14\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.256992 kubelet[2931]: E0115 00:23:28.256927 2931 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e4cb4a4bb61c49e8aba59191b9969f6c5743023ce4cde313f0834157ffebb14\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.257438 kubelet[2931]: E0115 00:23:28.257019 2931 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e4cb4a4bb61c49e8aba59191b9969f6c5743023ce4cde313f0834157ffebb14\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-ksjks" Jan 15 00:23:28.257438 kubelet[2931]: E0115 00:23:28.257040 2931 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e4cb4a4bb61c49e8aba59191b9969f6c5743023ce4cde313f0834157ffebb14\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-ksjks" Jan 15 00:23:28.257438 kubelet[2931]: E0115 00:23:28.257079 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-ksjks_kube-system(cacf3b5e-3a3d-4bb1-94ee-791570a7ddfd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-ksjks_kube-system(cacf3b5e-3a3d-4bb1-94ee-791570a7ddfd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3e4cb4a4bb61c49e8aba59191b9969f6c5743023ce4cde313f0834157ffebb14\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-ksjks" podUID="cacf3b5e-3a3d-4bb1-94ee-791570a7ddfd" Jan 15 00:23:28.263853 containerd[1703]: time="2026-01-15T00:23:28.263797146Z" level=error msg="Failed to destroy network for 
sandbox \"9c1b145ccf691159820d7dd76cae6f9f838e11fa9c221206fe952c19aaf3a58b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.265513 containerd[1703]: time="2026-01-15T00:23:28.265456311Z" level=error msg="Failed to destroy network for sandbox \"afed95acf64380ea572039d4d9e579054c7a651d3971ad05b5ec3ae9bb90a626\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.266898 containerd[1703]: time="2026-01-15T00:23:28.266863595Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d7bdbd78b-nn9k9,Uid:933e7fe5-e25e-48cf-938a-716b1fa3d838,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c1b145ccf691159820d7dd76cae6f9f838e11fa9c221206fe952c19aaf3a58b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.267384 kubelet[2931]: E0115 00:23:28.267320 2931 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c1b145ccf691159820d7dd76cae6f9f838e11fa9c221206fe952c19aaf3a58b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.267465 kubelet[2931]: E0115 00:23:28.267403 2931 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c1b145ccf691159820d7dd76cae6f9f838e11fa9c221206fe952c19aaf3a58b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" Jan 15 00:23:28.267465 kubelet[2931]: E0115 00:23:28.267427 2931 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c1b145ccf691159820d7dd76cae6f9f838e11fa9c221206fe952c19aaf3a58b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" Jan 15 00:23:28.267525 kubelet[2931]: E0115 00:23:28.267470 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-d7bdbd78b-nn9k9_calico-apiserver(933e7fe5-e25e-48cf-938a-716b1fa3d838)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-d7bdbd78b-nn9k9_calico-apiserver(933e7fe5-e25e-48cf-938a-716b1fa3d838)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9c1b145ccf691159820d7dd76cae6f9f838e11fa9c221206fe952c19aaf3a58b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:23:28.270417 
containerd[1703]: time="2026-01-15T00:23:28.270367766Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7877d5fb5-c885v,Uid:051b417e-bac4-4f72-8b07-3775d126567f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"afed95acf64380ea572039d4d9e579054c7a651d3971ad05b5ec3ae9bb90a626\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.270601 kubelet[2931]: E0115 00:23:28.270565 2931 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"afed95acf64380ea572039d4d9e579054c7a651d3971ad05b5ec3ae9bb90a626\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.270656 kubelet[2931]: E0115 00:23:28.270622 2931 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"afed95acf64380ea572039d4d9e579054c7a651d3971ad05b5ec3ae9bb90a626\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" Jan 15 00:23:28.270656 kubelet[2931]: E0115 00:23:28.270641 2931 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"afed95acf64380ea572039d4d9e579054c7a651d3971ad05b5ec3ae9bb90a626\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" Jan 15 00:23:28.270711 kubelet[2931]: E0115 00:23:28.270674 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7877d5fb5-c885v_calico-system(051b417e-bac4-4f72-8b07-3775d126567f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7877d5fb5-c885v_calico-system(051b417e-bac4-4f72-8b07-3775d126567f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"afed95acf64380ea572039d4d9e579054c7a651d3971ad05b5ec3ae9bb90a626\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:23:28.273409 containerd[1703]: time="2026-01-15T00:23:28.273339055Z" level=error msg="Failed to destroy network for sandbox \"eb3e21ee1c3d0f834121b7d0380e3419516c7ca8240b1e9df5d3d12da10dfbfc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.276329 containerd[1703]: time="2026-01-15T00:23:28.276284584Z" level=error msg="Failed to destroy network for sandbox \"895e4e9f34854f193a515c2b0dd9a89b97c69871aa0f57224946957b7f789e25\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.277885 containerd[1703]: time="2026-01-15T00:23:28.277847549Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7585bf5dbc-745zl,Uid:9b5a7ef8-11b0-402a-a15b-6cd8553f5062,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb3e21ee1c3d0f834121b7d0380e3419516c7ca8240b1e9df5d3d12da10dfbfc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.278371 kubelet[2931]: E0115 00:23:28.278325 2931 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb3e21ee1c3d0f834121b7d0380e3419516c7ca8240b1e9df5d3d12da10dfbfc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.278462 kubelet[2931]: E0115 00:23:28.278386 2931 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb3e21ee1c3d0f834121b7d0380e3419516c7ca8240b1e9df5d3d12da10dfbfc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7585bf5dbc-745zl" Jan 15 00:23:28.278462 kubelet[2931]: E0115 00:23:28.278405 2931 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb3e21ee1c3d0f834121b7d0380e3419516c7ca8240b1e9df5d3d12da10dfbfc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7585bf5dbc-745zl" Jan 15 00:23:28.278515 kubelet[2931]: E0115 00:23:28.278449 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7585bf5dbc-745zl_calico-system(9b5a7ef8-11b0-402a-a15b-6cd8553f5062)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7585bf5dbc-745zl_calico-system(9b5a7ef8-11b0-402a-a15b-6cd8553f5062)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"eb3e21ee1c3d0f834121b7d0380e3419516c7ca8240b1e9df5d3d12da10dfbfc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7585bf5dbc-745zl" podUID="9b5a7ef8-11b0-402a-a15b-6cd8553f5062" Jan 15 00:23:28.279329 containerd[1703]: time="2026-01-15T00:23:28.279279993Z" level=error msg="Failed to destroy network for sandbox \"4d186123bef9263d8dc2036b152c2e5d910afdf8460fe7073e3666cd4a860b89\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.281445 containerd[1703]: time="2026-01-15T00:23:28.281410160Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-92nsn,Uid:e8af8aef-db47-4bb0-9303-531f44a2593e,Namespace:calico-system,Attempt:0,} failed, 
error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"895e4e9f34854f193a515c2b0dd9a89b97c69871aa0f57224946957b7f789e25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.281779 containerd[1703]: time="2026-01-15T00:23:28.281740481Z" level=error msg="Failed to destroy network for sandbox \"eecf27fc2cfe5da588e1e7d792dca66a5af44284270db017593d08e7e4f8330f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.282082 kubelet[2931]: E0115 00:23:28.282048 2931 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"895e4e9f34854f193a515c2b0dd9a89b97c69871aa0f57224946957b7f789e25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.282186 kubelet[2931]: E0115 00:23:28.282113 2931 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"895e4e9f34854f193a515c2b0dd9a89b97c69871aa0f57224946957b7f789e25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-92nsn" Jan 15 00:23:28.282186 kubelet[2931]: E0115 00:23:28.282134 2931 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"895e4e9f34854f193a515c2b0dd9a89b97c69871aa0f57224946957b7f789e25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-92nsn" Jan 15 00:23:28.282270 kubelet[2931]: E0115 00:23:28.282218 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-92nsn_calico-system(e8af8aef-db47-4bb0-9303-531f44a2593e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-92nsn_calico-system(e8af8aef-db47-4bb0-9303-531f44a2593e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"895e4e9f34854f193a515c2b0dd9a89b97c69871aa0f57224946957b7f789e25\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:28.283168 containerd[1703]: time="2026-01-15T00:23:28.283033445Z" level=error msg="Failed to destroy network for sandbox \"019f078bddae88f39ec3533c1a94a998a7b9f85bc4efbefd253f1f1689b8744e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.284805 containerd[1703]: time="2026-01-15T00:23:28.284707170Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-666569f655-cd46w,Uid:576324d0-4c45-424a-9f35-c0de23b9b1ac,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"eecf27fc2cfe5da588e1e7d792dca66a5af44284270db017593d08e7e4f8330f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.284953 kubelet[2931]: E0115 00:23:28.284914 2931 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eecf27fc2cfe5da588e1e7d792dca66a5af44284270db017593d08e7e4f8330f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.285100 kubelet[2931]: E0115 00:23:28.285015 2931 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eecf27fc2cfe5da588e1e7d792dca66a5af44284270db017593d08e7e4f8330f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-cd46w" Jan 15 00:23:28.285100 kubelet[2931]: E0115 00:23:28.285038 2931 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eecf27fc2cfe5da588e1e7d792dca66a5af44284270db017593d08e7e4f8330f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-cd46w" Jan 15 00:23:28.285210 kubelet[2931]: E0115 00:23:28.285092 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-cd46w_calico-system(576324d0-4c45-424a-9f35-c0de23b9b1ac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-cd46w_calico-system(576324d0-4c45-424a-9f35-c0de23b9b1ac)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"eecf27fc2cfe5da588e1e7d792dca66a5af44284270db017593d08e7e4f8330f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:23:28.286010 containerd[1703]: time="2026-01-15T00:23:28.285963094Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d7bdbd78b-v4vh7,Uid:782417a5-ecd0-40c5-85c0-45ead5d347fd,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d186123bef9263d8dc2036b152c2e5d910afdf8460fe7073e3666cd4a860b89\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.286246 kubelet[2931]: E0115 00:23:28.286143 2931 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"4d186123bef9263d8dc2036b152c2e5d910afdf8460fe7073e3666cd4a860b89\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.286246 kubelet[2931]: E0115 00:23:28.286218 2931 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d186123bef9263d8dc2036b152c2e5d910afdf8460fe7073e3666cd4a860b89\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" Jan 15 00:23:28.286246 kubelet[2931]: E0115 00:23:28.286237 2931 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d186123bef9263d8dc2036b152c2e5d910afdf8460fe7073e3666cd4a860b89\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" Jan 15 00:23:28.286327 kubelet[2931]: E0115 00:23:28.286291 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-d7bdbd78b-v4vh7_calico-apiserver(782417a5-ecd0-40c5-85c0-45ead5d347fd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-d7bdbd78b-v4vh7_calico-apiserver(782417a5-ecd0-40c5-85c0-45ead5d347fd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4d186123bef9263d8dc2036b152c2e5d910afdf8460fe7073e3666cd4a860b89\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:23:28.289131 containerd[1703]: time="2026-01-15T00:23:28.288955343Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-67zqw,Uid:d39e4a7d-cae8-4f75-a430-aaeae5982278,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"019f078bddae88f39ec3533c1a94a998a7b9f85bc4efbefd253f1f1689b8744e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.289340 kubelet[2931]: E0115 00:23:28.289292 2931 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"019f078bddae88f39ec3533c1a94a998a7b9f85bc4efbefd253f1f1689b8744e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 00:23:28.289385 kubelet[2931]: E0115 00:23:28.289338 2931 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"019f078bddae88f39ec3533c1a94a998a7b9f85bc4efbefd253f1f1689b8744e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-67zqw" Jan 15 00:23:28.289385 kubelet[2931]: E0115 00:23:28.289361 2931 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"019f078bddae88f39ec3533c1a94a998a7b9f85bc4efbefd253f1f1689b8744e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-67zqw" Jan 15 00:23:28.289439 kubelet[2931]: E0115 00:23:28.289394 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-67zqw_kube-system(d39e4a7d-cae8-4f75-a430-aaeae5982278)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-67zqw_kube-system(d39e4a7d-cae8-4f75-a430-aaeae5982278)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"019f078bddae88f39ec3533c1a94a998a7b9f85bc4efbefd253f1f1689b8744e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-67zqw" podUID="d39e4a7d-cae8-4f75-a430-aaeae5982278" Jan 15 00:23:29.013822 containerd[1703]: time="2026-01-15T00:23:29.013724160Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 15 00:23:29.098879 systemd[1]: run-netns-cni\x2da1e783cc\x2db651\x2df130\x2dc8e6\x2df8838bb611bf.mount: Deactivated successfully. Jan 15 00:23:29.098977 systemd[1]: run-netns-cni\x2de9dd2fb2\x2d48c4\x2dce9e\x2d35e6\x2d3db3cb4732dc.mount: Deactivated successfully. Jan 15 00:23:29.099020 systemd[1]: run-netns-cni\x2d2c7e1b33\x2ddfd0\x2da6a1\x2d9e6b\x2d3b44bd946dc8.mount: Deactivated successfully. Jan 15 00:23:29.099068 systemd[1]: run-netns-cni\x2dff21dc76\x2d04fa\x2df23a\x2dfd55\x2da59a6e002004.mount: Deactivated successfully. Jan 15 00:23:29.099110 systemd[1]: run-netns-cni\x2d2aaab2bf\x2d17cf\x2d0439\x2d7a42\x2deb34642122c7.mount: Deactivated successfully. Jan 15 00:23:35.904248 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2586141027.mount: Deactivated successfully. 
Jan 15 00:23:35.930099 containerd[1703]: time="2026-01-15T00:23:35.930051475Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:23:35.935002 containerd[1703]: time="2026-01-15T00:23:35.934942930Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Jan 15 00:23:35.936466 containerd[1703]: time="2026-01-15T00:23:35.936410214Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:23:35.943167 containerd[1703]: time="2026-01-15T00:23:35.943045314Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 00:23:35.943435 containerd[1703]: time="2026-01-15T00:23:35.943393516Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 6.929614675s" Jan 15 00:23:35.943435 containerd[1703]: time="2026-01-15T00:23:35.943431716Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Jan 15 00:23:35.954745 containerd[1703]: time="2026-01-15T00:23:35.954707230Z" level=info msg="CreateContainer within sandbox \"d6ae2bcad0acad21822d5e685815846b96808ca2fac64e63071a36e05055577e\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 15 00:23:35.979636 containerd[1703]: time="2026-01-15T00:23:35.978392343Z" level=info msg="Container 2fcaf57cf90f04b7899d5cef15df33368aacd5e0cf13be3d22a705c256f86c0c: CDI devices from CRI Config.CDIDevices: []" Jan 15 00:23:35.995733 containerd[1703]: time="2026-01-15T00:23:35.995685355Z" level=info msg="CreateContainer within sandbox \"d6ae2bcad0acad21822d5e685815846b96808ca2fac64e63071a36e05055577e\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"2fcaf57cf90f04b7899d5cef15df33368aacd5e0cf13be3d22a705c256f86c0c\"" Jan 15 00:23:35.996536 containerd[1703]: time="2026-01-15T00:23:35.996508998Z" level=info msg="StartContainer for \"2fcaf57cf90f04b7899d5cef15df33368aacd5e0cf13be3d22a705c256f86c0c\"" Jan 15 00:23:35.998371 containerd[1703]: time="2026-01-15T00:23:35.998340844Z" level=info msg="connecting to shim 2fcaf57cf90f04b7899d5cef15df33368aacd5e0cf13be3d22a705c256f86c0c" address="unix:///run/containerd/s/02da95c7790ec6d92d44d4add68da87d8f7286b5aa506bb12dc72ccb99e74722" protocol=ttrpc version=3 Jan 15 00:23:36.021456 systemd[1]: Started cri-containerd-2fcaf57cf90f04b7899d5cef15df33368aacd5e0cf13be3d22a705c256f86c0c.scope - libcontainer container 2fcaf57cf90f04b7899d5cef15df33368aacd5e0cf13be3d22a705c256f86c0c. 
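The pull record above gives both the bytes containerd read for ghcr.io/flatcar/calico/node:v3.30.4 (150930912) and the wall-clock pull time (6.929614675s), which works out to roughly 21.8 MB/s. The snippet below is just that division, with the figures copied from the log and the caveat that "bytes read" reflects registry transfer rather than the unpacked image size:

    package main

    import "fmt"

    func main() {
        const bytesRead = 150930912.0 // "bytes read=150930912" in the log
        const seconds = 6.929614675   // "in 6.929614675s" in the log
        rate := bytesRead / seconds
        fmt.Printf("~%.1f MB/s (~%.1f MiB/s)\n", rate/1e6, rate/(1<<20))
    }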
Jan 15 00:23:36.089000 audit: BPF prog-id=172 op=LOAD Jan 15 00:23:36.093037 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 15 00:23:36.093102 kernel: audit: type=1334 audit(1768436616.089:584): prog-id=172 op=LOAD Jan 15 00:23:36.093127 kernel: audit: type=1300 audit(1768436616.089:584): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=3456 pid=4061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:36.089000 audit[4061]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=3456 pid=4061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:36.097198 kernel: audit: type=1327 audit(1768436616.089:584): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266636166353763663930663034623738393964356365663135646633 Jan 15 00:23:36.089000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266636166353763663930663034623738393964356365663135646633 Jan 15 00:23:36.090000 audit: BPF prog-id=173 op=LOAD Jan 15 00:23:36.102022 kernel: audit: type=1334 audit(1768436616.090:585): prog-id=173 op=LOAD Jan 15 00:23:36.102081 kernel: audit: type=1300 audit(1768436616.090:585): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=3456 pid=4061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:36.090000 audit[4061]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=3456 pid=4061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:36.090000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266636166353763663930663034623738393964356365663135646633 Jan 15 00:23:36.110023 kernel: audit: type=1327 audit(1768436616.090:585): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266636166353763663930663034623738393964356365663135646633 Jan 15 00:23:36.110313 kernel: audit: type=1334 audit(1768436616.090:586): prog-id=173 op=UNLOAD Jan 15 00:23:36.090000 audit: BPF prog-id=173 op=UNLOAD Jan 15 00:23:36.090000 audit[4061]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3456 pid=4061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:36.115217 kernel: audit: type=1300 
audit(1768436616.090:586): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3456 pid=4061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:36.090000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266636166353763663930663034623738393964356365663135646633 Jan 15 00:23:36.119346 kernel: audit: type=1327 audit(1768436616.090:586): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266636166353763663930663034623738393964356365663135646633 Jan 15 00:23:36.119452 kernel: audit: type=1334 audit(1768436616.090:587): prog-id=172 op=UNLOAD Jan 15 00:23:36.090000 audit: BPF prog-id=172 op=UNLOAD Jan 15 00:23:36.090000 audit[4061]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3456 pid=4061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:36.090000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266636166353763663930663034623738393964356365663135646633 Jan 15 00:23:36.090000 audit: BPF prog-id=174 op=LOAD Jan 15 00:23:36.090000 audit[4061]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=3456 pid=4061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:36.090000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266636166353763663930663034623738393964356365663135646633 Jan 15 00:23:36.133642 containerd[1703]: time="2026-01-15T00:23:36.133601617Z" level=info msg="StartContainer for \"2fcaf57cf90f04b7899d5cef15df33368aacd5e0cf13be3d22a705c256f86c0c\" returns successfully" Jan 15 00:23:36.279531 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 15 00:23:36.279650 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
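For readers unfamiliar with these audit records: arch=c00000b7 is AUDIT_ARCH_AARCH64, syscall=280 is bpf(2) on arm64 (the record's exit value is the fd of the freshly loaded program), and syscall=57 is close(2) on that fd. That is why each "BPF prog-id=N op=LOAD" here is paired with an "op=UNLOAD" carried by the same audit event as the close: prog-id 173 (fd 22) is closed via a0=0x16, then prog-id 172 (fd 20) via a0=0x14. A small lookup table for the syscall numbers that recur in this section (arm64 asm-generic numbering; the helper is illustrative only):

```python
# arm64 (asm-generic) syscall numbers behind the SYSCALL audit records in this section.
AARCH64_SYSCALLS = {
    280: "bpf",      # "BPF prog-id=... op=LOAD"; the audit 'exit' field is the new prog fd
    57:  "close",    # closing that fd; the same audit event carries "op=UNLOAD"
    211: "sendmsg",  # netlink transactions from iptables-nft-restore, further down
    35:  "unlinkat", # calico-node removing a pinned BPF object, further down
}

def summarise(syscall: int, exit_code: int) -> str:
    """One-line summary of a SYSCALL audit record, e.g. syscall=280 exit=20."""
    name = AARCH64_SYSCALLS.get(syscall, f"syscall {syscall}")
    return f"{name}() -> {exit_code}"

print(summarise(280, 20))  # bpf() -> 20   (prog-id=172 loaded as fd 20)
print(summarise(57, 0))    # close() -> 0  (fd closed; program unloaded)
```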
Jan 15 00:23:36.447198 kubelet[2931]: I0115 00:23:36.447028 2931 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b5a7ef8-11b0-402a-a15b-6cd8553f5062-whisker-ca-bundle\") pod \"9b5a7ef8-11b0-402a-a15b-6cd8553f5062\" (UID: \"9b5a7ef8-11b0-402a-a15b-6cd8553f5062\") " Jan 15 00:23:36.447198 kubelet[2931]: I0115 00:23:36.447084 2931 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkx7m\" (UniqueName: \"kubernetes.io/projected/9b5a7ef8-11b0-402a-a15b-6cd8553f5062-kube-api-access-bkx7m\") pod \"9b5a7ef8-11b0-402a-a15b-6cd8553f5062\" (UID: \"9b5a7ef8-11b0-402a-a15b-6cd8553f5062\") " Jan 15 00:23:36.447198 kubelet[2931]: I0115 00:23:36.447105 2931 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9b5a7ef8-11b0-402a-a15b-6cd8553f5062-whisker-backend-key-pair\") pod \"9b5a7ef8-11b0-402a-a15b-6cd8553f5062\" (UID: \"9b5a7ef8-11b0-402a-a15b-6cd8553f5062\") " Jan 15 00:23:36.448569 kubelet[2931]: I0115 00:23:36.448412 2931 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b5a7ef8-11b0-402a-a15b-6cd8553f5062-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "9b5a7ef8-11b0-402a-a15b-6cd8553f5062" (UID: "9b5a7ef8-11b0-402a-a15b-6cd8553f5062"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 15 00:23:36.450671 kubelet[2931]: I0115 00:23:36.450632 2931 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b5a7ef8-11b0-402a-a15b-6cd8553f5062-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "9b5a7ef8-11b0-402a-a15b-6cd8553f5062" (UID: "9b5a7ef8-11b0-402a-a15b-6cd8553f5062"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 15 00:23:36.450903 kubelet[2931]: I0115 00:23:36.450874 2931 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b5a7ef8-11b0-402a-a15b-6cd8553f5062-kube-api-access-bkx7m" (OuterVolumeSpecName: "kube-api-access-bkx7m") pod "9b5a7ef8-11b0-402a-a15b-6cd8553f5062" (UID: "9b5a7ef8-11b0-402a-a15b-6cd8553f5062"). InnerVolumeSpecName "kube-api-access-bkx7m". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 15 00:23:36.548294 kubelet[2931]: I0115 00:23:36.548114 2931 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9b5a7ef8-11b0-402a-a15b-6cd8553f5062-whisker-backend-key-pair\") on node \"ci-4515-1-0-n-1ddc109f0f\" DevicePath \"\"" Jan 15 00:23:36.548294 kubelet[2931]: I0115 00:23:36.548149 2931 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b5a7ef8-11b0-402a-a15b-6cd8553f5062-whisker-ca-bundle\") on node \"ci-4515-1-0-n-1ddc109f0f\" DevicePath \"\"" Jan 15 00:23:36.548294 kubelet[2931]: I0115 00:23:36.548160 2931 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bkx7m\" (UniqueName: \"kubernetes.io/projected/9b5a7ef8-11b0-402a-a15b-6cd8553f5062-kube-api-access-bkx7m\") on node \"ci-4515-1-0-n-1ddc109f0f\" DevicePath \"\"" Jan 15 00:23:36.878563 systemd[1]: Removed slice kubepods-besteffort-pod9b5a7ef8_11b0_402a_a15b_6cd8553f5062.slice - libcontainer container kubepods-besteffort-pod9b5a7ef8_11b0_402a_a15b_6cd8553f5062.slice. Jan 15 00:23:36.905125 systemd[1]: var-lib-kubelet-pods-9b5a7ef8\x2d11b0\x2d402a\x2da15b\x2d6cd8553f5062-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dbkx7m.mount: Deactivated successfully. Jan 15 00:23:36.905235 systemd[1]: var-lib-kubelet-pods-9b5a7ef8\x2d11b0\x2d402a\x2da15b\x2d6cd8553f5062-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 15 00:23:37.072558 kubelet[2931]: I0115 00:23:37.072454 2931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-rh7w5" podStartSLOduration=2.434662921 podStartE2EDuration="36.072433689s" podCreationTimestamp="2026-01-15 00:23:01 +0000 UTC" firstStartedPulling="2026-01-15 00:23:02.306765951 +0000 UTC m=+23.524349298" lastFinishedPulling="2026-01-15 00:23:35.944536719 +0000 UTC m=+57.162120066" observedRunningTime="2026-01-15 00:23:37.058036325 +0000 UTC m=+58.275619672" watchObservedRunningTime="2026-01-15 00:23:37.072433689 +0000 UTC m=+58.290016996" Jan 15 00:23:37.118714 systemd[1]: Created slice kubepods-besteffort-pod14a4f6b6_e857_4a63_b075_14b068610222.slice - libcontainer container kubepods-besteffort-pod14a4f6b6_e857_4a63_b075_14b068610222.slice. 
Jan 15 00:23:37.152462 kubelet[2931]: I0115 00:23:37.152230 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flcf6\" (UniqueName: \"kubernetes.io/projected/14a4f6b6-e857-4a63-b075-14b068610222-kube-api-access-flcf6\") pod \"whisker-955f9fbff-rtvzr\" (UID: \"14a4f6b6-e857-4a63-b075-14b068610222\") " pod="calico-system/whisker-955f9fbff-rtvzr" Jan 15 00:23:37.152462 kubelet[2931]: I0115 00:23:37.152341 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14a4f6b6-e857-4a63-b075-14b068610222-whisker-ca-bundle\") pod \"whisker-955f9fbff-rtvzr\" (UID: \"14a4f6b6-e857-4a63-b075-14b068610222\") " pod="calico-system/whisker-955f9fbff-rtvzr" Jan 15 00:23:37.152706 kubelet[2931]: I0115 00:23:37.152487 2931 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/14a4f6b6-e857-4a63-b075-14b068610222-whisker-backend-key-pair\") pod \"whisker-955f9fbff-rtvzr\" (UID: \"14a4f6b6-e857-4a63-b075-14b068610222\") " pod="calico-system/whisker-955f9fbff-rtvzr" Jan 15 00:23:37.423800 containerd[1703]: time="2026-01-15T00:23:37.423666723Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-955f9fbff-rtvzr,Uid:14a4f6b6-e857-4a63-b075-14b068610222,Namespace:calico-system,Attempt:0,}" Jan 15 00:23:37.747000 audit: BPF prog-id=175 op=LOAD Jan 15 00:23:37.747000 audit[4261]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcd768748 a2=98 a3=ffffcd768738 items=0 ppid=4162 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.747000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 00:23:37.747000 audit: BPF prog-id=175 op=UNLOAD Jan 15 00:23:37.747000 audit[4261]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffcd768718 a3=0 items=0 ppid=4162 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.747000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 00:23:37.748000 audit: BPF prog-id=176 op=LOAD Jan 15 00:23:37.748000 audit[4261]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcd7685f8 a2=74 a3=95 items=0 ppid=4162 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.748000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 
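The PROCTITLE values in these audit records are the process's argv, hex-encoded with NUL bytes separating arguments. Decoding them turns the opaque strings into readable commands: the short value above is `bpftool map list --json`, and the longer one ending in ...666C6167730030 is `bpftool map create /sys/fs/bpf/tc/globals/cali_ctlb_progs type prog_array key 4 value 4 entries 3 name cali_ctlb_progs flags 0`. (The kernel truncates very long titles, which is why the runc proctitles earlier in this section end partway through the container ID.) A minimal decoder:

```python
def decode_proctitle(hex_title: str) -> str:
    """Audit PROCTITLE values are argv hex-encoded, with NUL separating arguments."""
    return bytes.fromhex(hex_title).decode("utf-8", errors="replace").replace("\x00", " ")

# PROCTITLE value taken from one of the bpftool audit records above.
print(decode_proctitle("627066746F6F6C006D6170006C697374002D2D6A736F6E"))
# bpftool map list --json
```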
Jan 15 00:23:37.748000 audit: BPF prog-id=176 op=UNLOAD Jan 15 00:23:37.748000 audit[4261]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4162 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.748000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 00:23:37.748000 audit: BPF prog-id=177 op=LOAD Jan 15 00:23:37.748000 audit[4261]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcd768628 a2=40 a3=ffffcd768658 items=0 ppid=4162 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.748000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 00:23:37.748000 audit: BPF prog-id=177 op=UNLOAD Jan 15 00:23:37.748000 audit[4261]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffcd768658 items=0 ppid=4162 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.748000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 00:23:37.750000 audit: BPF prog-id=178 op=LOAD Jan 15 00:23:37.750000 audit[4262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffee75da98 a2=98 a3=ffffee75da88 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.750000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.751000 audit: BPF prog-id=178 op=UNLOAD Jan 15 00:23:37.751000 audit[4262]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffee75da68 a3=0 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.751000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.751000 audit: BPF prog-id=179 op=LOAD Jan 15 00:23:37.751000 audit[4262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffee75d728 a2=74 a3=95 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.751000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 
00:23:37.751000 audit: BPF prog-id=179 op=UNLOAD Jan 15 00:23:37.751000 audit[4262]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.751000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.751000 audit: BPF prog-id=180 op=LOAD Jan 15 00:23:37.751000 audit[4262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffee75d788 a2=94 a3=2 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.751000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.751000 audit: BPF prog-id=180 op=UNLOAD Jan 15 00:23:37.751000 audit[4262]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.751000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.854000 audit: BPF prog-id=181 op=LOAD Jan 15 00:23:37.854000 audit[4262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffee75d748 a2=40 a3=ffffee75d778 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.854000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.854000 audit: BPF prog-id=181 op=UNLOAD Jan 15 00:23:37.854000 audit[4262]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffee75d778 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.854000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.863000 audit: BPF prog-id=182 op=LOAD Jan 15 00:23:37.863000 audit[4262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffee75d758 a2=94 a3=4 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.863000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.864000 audit: BPF prog-id=182 op=UNLOAD Jan 15 00:23:37.864000 audit[4262]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.864000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.864000 audit: BPF prog-id=183 op=LOAD Jan 15 00:23:37.864000 audit[4262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffee75d598 a2=94 a3=5 items=0 
ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.864000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.864000 audit: BPF prog-id=183 op=UNLOAD Jan 15 00:23:37.864000 audit[4262]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.864000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.864000 audit: BPF prog-id=184 op=LOAD Jan 15 00:23:37.864000 audit[4262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffee75d7c8 a2=94 a3=6 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.864000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.864000 audit: BPF prog-id=184 op=UNLOAD Jan 15 00:23:37.864000 audit[4262]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.864000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.864000 audit: BPF prog-id=185 op=LOAD Jan 15 00:23:37.864000 audit[4262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffee75cf98 a2=94 a3=83 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.864000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.864000 audit: BPF prog-id=186 op=LOAD Jan 15 00:23:37.864000 audit[4262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffee75cd58 a2=94 a3=2 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.864000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.864000 audit: BPF prog-id=186 op=UNLOAD Jan 15 00:23:37.864000 audit[4262]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.864000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.865000 audit: BPF prog-id=185 op=UNLOAD Jan 15 00:23:37.865000 audit[4262]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=3c01b620 a3=3c00eb00 items=0 ppid=4162 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 15 00:23:37.865000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 00:23:37.874000 audit: BPF prog-id=187 op=LOAD Jan 15 00:23:37.874000 audit[4281]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd21ead08 a2=98 a3=ffffd21eacf8 items=0 ppid=4162 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.874000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 00:23:37.874000 audit: BPF prog-id=187 op=UNLOAD Jan 15 00:23:37.874000 audit[4281]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd21eacd8 a3=0 items=0 ppid=4162 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.874000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 00:23:37.874000 audit: BPF prog-id=188 op=LOAD Jan 15 00:23:37.874000 audit[4281]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd21eabb8 a2=74 a3=95 items=0 ppid=4162 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.874000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 00:23:37.874000 audit: BPF prog-id=188 op=UNLOAD Jan 15 00:23:37.874000 audit[4281]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4162 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.874000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 00:23:37.874000 audit: BPF prog-id=189 op=LOAD Jan 15 00:23:37.874000 audit[4281]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd21eabe8 a2=40 a3=ffffd21eac18 items=0 ppid=4162 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.874000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 00:23:37.875000 audit: BPF prog-id=189 op=UNLOAD Jan 15 00:23:37.875000 audit[4281]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffd21eac18 items=0 ppid=4162 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:37.875000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 00:23:38.650575 systemd-networkd[1612]: vxlan.calico: Link UP Jan 15 00:23:38.650585 systemd-networkd[1612]: vxlan.calico: Gained carrier Jan 15 00:23:38.670000 audit: BPF prog-id=190 op=LOAD Jan 15 00:23:38.670000 audit[4330]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe5cfd028 a2=98 a3=ffffe5cfd018 items=0 ppid=4162 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.670000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 00:23:38.671000 audit: BPF prog-id=190 op=UNLOAD Jan 15 00:23:38.671000 audit[4330]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffe5cfcff8 a3=0 items=0 ppid=4162 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.671000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 00:23:38.671000 audit: BPF prog-id=191 op=LOAD Jan 15 00:23:38.671000 audit[4330]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe5cfcd08 a2=74 a3=95 items=0 ppid=4162 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.671000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 00:23:38.671000 audit: BPF prog-id=191 op=UNLOAD Jan 15 00:23:38.671000 audit[4330]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4162 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.671000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 00:23:38.671000 audit: BPF prog-id=192 op=LOAD Jan 15 00:23:38.671000 audit[4330]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe5cfcd68 a2=94 a3=2 items=0 ppid=4162 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.671000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 00:23:38.671000 audit: BPF prog-id=192 op=UNLOAD Jan 15 00:23:38.671000 audit[4330]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4162 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.671000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 00:23:38.671000 audit: BPF prog-id=193 op=LOAD Jan 15 00:23:38.671000 audit[4330]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe5cfcbe8 a2=40 a3=ffffe5cfcc18 items=0 ppid=4162 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.671000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 00:23:38.671000 audit: BPF prog-id=193 op=UNLOAD Jan 15 00:23:38.671000 audit[4330]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=ffffe5cfcc18 items=0 ppid=4162 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.671000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 00:23:38.671000 audit: BPF prog-id=194 op=LOAD Jan 15 00:23:38.671000 audit[4330]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe5cfcd38 a2=94 a3=b7 items=0 ppid=4162 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.671000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 00:23:38.671000 audit: BPF prog-id=194 op=UNLOAD Jan 15 00:23:38.671000 audit[4330]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4162 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.671000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 00:23:38.671000 audit: BPF prog-id=195 op=LOAD Jan 15 00:23:38.671000 audit[4330]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe5cfc3e8 a2=94 a3=2 items=0 ppid=4162 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.671000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 00:23:38.671000 audit: BPF prog-id=195 op=UNLOAD Jan 15 00:23:38.671000 audit[4330]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4162 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.671000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 00:23:38.671000 audit: BPF prog-id=196 op=LOAD Jan 15 00:23:38.671000 audit[4330]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe5cfc578 a2=94 a3=30 items=0 ppid=4162 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.671000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 00:23:38.676000 audit: BPF prog-id=197 op=LOAD Jan 15 00:23:38.676000 audit[4334]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc99e6ee8 a2=98 a3=ffffc99e6ed8 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.676000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.676000 audit: BPF prog-id=197 op=UNLOAD Jan 15 00:23:38.676000 audit[4334]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffc99e6eb8 a3=0 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.676000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.677000 audit: BPF prog-id=198 op=LOAD Jan 15 00:23:38.677000 audit[4334]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc99e6b78 a2=74 a3=95 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.677000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.677000 audit: BPF prog-id=198 op=UNLOAD Jan 15 00:23:38.677000 audit[4334]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.677000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.677000 audit: BPF prog-id=199 op=LOAD Jan 15 00:23:38.677000 audit[4334]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc99e6bd8 a2=94 a3=2 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.677000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.677000 audit: BPF prog-id=199 op=UNLOAD Jan 15 00:23:38.677000 audit[4334]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.677000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.785000 audit: BPF prog-id=200 op=LOAD Jan 15 00:23:38.785000 audit[4334]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc99e6b98 a2=40 a3=ffffc99e6bc8 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.785000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.786000 audit: BPF prog-id=200 op=UNLOAD Jan 15 00:23:38.786000 audit[4334]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffc99e6bc8 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.786000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.800000 audit: BPF prog-id=201 op=LOAD Jan 15 00:23:38.800000 audit[4334]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc99e6ba8 a2=94 a3=4 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.800000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.800000 audit: BPF prog-id=201 op=UNLOAD Jan 15 00:23:38.800000 audit[4334]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.800000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.801000 audit: BPF prog-id=202 op=LOAD Jan 15 00:23:38.801000 audit[4334]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffc99e69e8 a2=94 a3=5 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.801000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.801000 audit: BPF prog-id=202 op=UNLOAD Jan 15 00:23:38.801000 audit[4334]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.801000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.801000 audit: BPF prog-id=203 op=LOAD Jan 15 00:23:38.801000 audit[4334]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc99e6c18 a2=94 a3=6 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.801000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.801000 audit: BPF prog-id=203 op=UNLOAD Jan 15 00:23:38.801000 audit[4334]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 
items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.801000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.801000 audit: BPF prog-id=204 op=LOAD Jan 15 00:23:38.801000 audit[4334]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc99e63e8 a2=94 a3=83 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.801000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.801000 audit: BPF prog-id=205 op=LOAD Jan 15 00:23:38.801000 audit[4334]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffc99e61a8 a2=94 a3=2 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.801000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.801000 audit: BPF prog-id=205 op=UNLOAD Jan 15 00:23:38.801000 audit[4334]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.801000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.802000 audit: BPF prog-id=204 op=UNLOAD Jan 15 00:23:38.802000 audit[4334]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=35d98620 a3=35d8bb00 items=0 ppid=4162 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.802000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 00:23:38.809000 audit: BPF prog-id=196 op=UNLOAD Jan 15 00:23:38.809000 audit[4162]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=400090e140 a2=0 a3=0 items=0 ppid=4153 pid=4162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.809000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 15 00:23:38.855000 audit[4383]: NETFILTER_CFG table=mangle:121 family=2 entries=16 op=nft_register_chain pid=4383 subj=system_u:system_r:kernel_t:s0 
comm="iptables-nft-re" Jan 15 00:23:38.855000 audit[4383]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=fffff63d44f0 a2=0 a3=ffff8cbd1fa8 items=0 ppid=4162 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.855000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 00:23:38.855000 audit[4385]: NETFILTER_CFG table=nat:122 family=2 entries=15 op=nft_register_chain pid=4385 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 00:23:38.855000 audit[4385]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=ffffcd04b500 a2=0 a3=ffff85ff0fa8 items=0 ppid=4162 pid=4385 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.855000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 00:23:38.864000 audit[4388]: NETFILTER_CFG table=filter:123 family=2 entries=39 op=nft_register_chain pid=4388 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 00:23:38.864000 audit[4388]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=18968 a0=3 a1=fffff3afa210 a2=0 a3=ffff9af0ffa8 items=0 ppid=4162 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.864000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 00:23:38.864000 audit[4384]: NETFILTER_CFG table=raw:124 family=2 entries=21 op=nft_register_chain pid=4384 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 00:23:38.864000 audit[4384]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=fffffdcfaab0 a2=0 a3=ffffb9c4dfa8 items=0 ppid=4162 pid=4384 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.864000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 00:23:38.873270 containerd[1703]: time="2026-01-15T00:23:38.873030196Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d7bdbd78b-v4vh7,Uid:782417a5-ecd0-40c5-85c0-45ead5d347fd,Namespace:calico-apiserver,Attempt:0,}" Jan 15 00:23:38.877192 kubelet[2931]: I0115 00:23:38.876521 2931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b5a7ef8-11b0-402a-a15b-6cd8553f5062" path="/var/lib/kubelet/pods/9b5a7ef8-11b0-402a-a15b-6cd8553f5062/volumes" Jan 15 00:23:38.932952 systemd-networkd[1612]: calief1fd2fe0fb: Link UP Jan 15 00:23:38.935582 systemd-networkd[1612]: calief1fd2fe0fb: Gained carrier Jan 15 00:23:38.949585 containerd[1703]: 2026-01-15 00:23:38.808 [INFO][4343] cni-plugin/plugin.go 340: Calico CNI found existing 
endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--1ddc109f0f-k8s-whisker--955f9fbff--rtvzr-eth0 whisker-955f9fbff- calico-system 14a4f6b6-e857-4a63-b075-14b068610222 946 0 2026-01-15 00:23:37 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:955f9fbff projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4515-1-0-n-1ddc109f0f whisker-955f9fbff-rtvzr eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calief1fd2fe0fb [] [] }} ContainerID="5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" Namespace="calico-system" Pod="whisker-955f9fbff-rtvzr" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-whisker--955f9fbff--rtvzr-" Jan 15 00:23:38.949585 containerd[1703]: 2026-01-15 00:23:38.808 [INFO][4343] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" Namespace="calico-system" Pod="whisker-955f9fbff-rtvzr" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-whisker--955f9fbff--rtvzr-eth0" Jan 15 00:23:38.949585 containerd[1703]: 2026-01-15 00:23:38.865 [INFO][4359] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" HandleID="k8s-pod-network.5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-whisker--955f9fbff--rtvzr-eth0" Jan 15 00:23:38.949791 containerd[1703]: 2026-01-15 00:23:38.865 [INFO][4359] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" HandleID="k8s-pod-network.5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-whisker--955f9fbff--rtvzr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000136490), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-n-1ddc109f0f", "pod":"whisker-955f9fbff-rtvzr", "timestamp":"2026-01-15 00:23:38.865235053 +0000 UTC"}, Hostname:"ci-4515-1-0-n-1ddc109f0f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 00:23:38.949791 containerd[1703]: 2026-01-15 00:23:38.865 [INFO][4359] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 00:23:38.949791 containerd[1703]: 2026-01-15 00:23:38.865 [INFO][4359] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 15 00:23:38.949791 containerd[1703]: 2026-01-15 00:23:38.865 [INFO][4359] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-1ddc109f0f' Jan 15 00:23:38.949791 containerd[1703]: 2026-01-15 00:23:38.884 [INFO][4359] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:38.949791 containerd[1703]: 2026-01-15 00:23:38.890 [INFO][4359] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:38.949791 containerd[1703]: 2026-01-15 00:23:38.896 [INFO][4359] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:38.949791 containerd[1703]: 2026-01-15 00:23:38.902 [INFO][4359] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:38.949791 containerd[1703]: 2026-01-15 00:23:38.904 [INFO][4359] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:38.949968 containerd[1703]: 2026-01-15 00:23:38.905 [INFO][4359] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:38.949968 containerd[1703]: 2026-01-15 00:23:38.906 [INFO][4359] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232 Jan 15 00:23:38.949968 containerd[1703]: 2026-01-15 00:23:38.916 [INFO][4359] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:38.949968 containerd[1703]: 2026-01-15 00:23:38.924 [INFO][4359] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.106.1/26] block=192.168.106.0/26 handle="k8s-pod-network.5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:38.949968 containerd[1703]: 2026-01-15 00:23:38.924 [INFO][4359] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.1/26] handle="k8s-pod-network.5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:38.949968 containerd[1703]: 2026-01-15 00:23:38.924 [INFO][4359] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
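The IPAM trace above walks through Calico's block-affinity flow: the host's affine block 192.168.106.0/26 is loaded, one address is claimed from it (192.168.106.1/26), and the endpoint is later written with the /32 form. A /26 block holds 64 addresses, which is easy to confirm with the standard library (values taken directly from the log; illustrative only):

```python
import ipaddress

block = ipaddress.ip_network("192.168.106.0/26")   # node-affine IPAM block from the log
assigned = ipaddress.ip_address("192.168.106.1")   # address claimed for whisker-955f9fbff-rtvzr

print(block.num_addresses)       # 64
print(assigned in block)         # True
print(block.broadcast_address)   # 192.168.106.63 -> the block spans .0 through .63
```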
Jan 15 00:23:38.949968 containerd[1703]: 2026-01-15 00:23:38.924 [INFO][4359] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.1/26] IPv6=[] ContainerID="5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" HandleID="k8s-pod-network.5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-whisker--955f9fbff--rtvzr-eth0" Jan 15 00:23:38.950150 containerd[1703]: 2026-01-15 00:23:38.930 [INFO][4343] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" Namespace="calico-system" Pod="whisker-955f9fbff-rtvzr" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-whisker--955f9fbff--rtvzr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-whisker--955f9fbff--rtvzr-eth0", GenerateName:"whisker-955f9fbff-", Namespace:"calico-system", SelfLink:"", UID:"14a4f6b6-e857-4a63-b075-14b068610222", ResourceVersion:"946", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 23, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"955f9fbff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"", Pod:"whisker-955f9fbff-rtvzr", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.106.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calief1fd2fe0fb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:38.950150 containerd[1703]: 2026-01-15 00:23:38.930 [INFO][4343] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.1/32] ContainerID="5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" Namespace="calico-system" Pod="whisker-955f9fbff-rtvzr" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-whisker--955f9fbff--rtvzr-eth0" Jan 15 00:23:38.950243 containerd[1703]: 2026-01-15 00:23:38.930 [INFO][4343] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calief1fd2fe0fb ContainerID="5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" Namespace="calico-system" Pod="whisker-955f9fbff-rtvzr" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-whisker--955f9fbff--rtvzr-eth0" Jan 15 00:23:38.950243 containerd[1703]: 2026-01-15 00:23:38.933 [INFO][4343] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" Namespace="calico-system" Pod="whisker-955f9fbff-rtvzr" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-whisker--955f9fbff--rtvzr-eth0" Jan 15 00:23:38.950286 containerd[1703]: 2026-01-15 00:23:38.934 [INFO][4343] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" Namespace="calico-system" 
Pod="whisker-955f9fbff-rtvzr" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-whisker--955f9fbff--rtvzr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-whisker--955f9fbff--rtvzr-eth0", GenerateName:"whisker-955f9fbff-", Namespace:"calico-system", SelfLink:"", UID:"14a4f6b6-e857-4a63-b075-14b068610222", ResourceVersion:"946", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 23, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"955f9fbff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232", Pod:"whisker-955f9fbff-rtvzr", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.106.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calief1fd2fe0fb", MAC:"ba:e9:9f:5b:76:44", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:38.950333 containerd[1703]: 2026-01-15 00:23:38.945 [INFO][4343] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" Namespace="calico-system" Pod="whisker-955f9fbff-rtvzr" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-whisker--955f9fbff--rtvzr-eth0" Jan 15 00:23:38.978703 containerd[1703]: time="2026-01-15T00:23:38.978652639Z" level=info msg="connecting to shim 5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232" address="unix:///run/containerd/s/2508d05c1f964125b9bcb16805d9b3eb0c7969de5a8a6c309a6a700db03957e7" namespace=k8s.io protocol=ttrpc version=3 Jan 15 00:23:38.969000 audit[4428]: NETFILTER_CFG table=filter:125 family=2 entries=59 op=nft_register_chain pid=4428 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 00:23:38.969000 audit[4428]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=35860 a0=3 a1=ffffcef23ba0 a2=0 a3=ffff9275dfa8 items=0 ppid=4162 pid=4428 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:38.969000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 00:23:39.008523 systemd[1]: Started cri-containerd-5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232.scope - libcontainer container 5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232. 
Jan 15 00:23:39.021000 audit: BPF prog-id=206 op=LOAD Jan 15 00:23:39.022000 audit: BPF prog-id=207 op=LOAD Jan 15 00:23:39.022000 audit[4450]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40000fe180 a2=98 a3=0 items=0 ppid=4437 pid=4450 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:39.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566633761663635313639356435613336303366386261363635643634 Jan 15 00:23:39.022000 audit: BPF prog-id=207 op=UNLOAD Jan 15 00:23:39.022000 audit[4450]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4437 pid=4450 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:39.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566633761663635313639356435613336303366386261363635643634 Jan 15 00:23:39.022000 audit: BPF prog-id=208 op=LOAD Jan 15 00:23:39.022000 audit[4450]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40000fe3e8 a2=98 a3=0 items=0 ppid=4437 pid=4450 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:39.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566633761663635313639356435613336303366386261363635643634 Jan 15 00:23:39.022000 audit: BPF prog-id=209 op=LOAD Jan 15 00:23:39.022000 audit[4450]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40000fe168 a2=98 a3=0 items=0 ppid=4437 pid=4450 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:39.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566633761663635313639356435613336303366386261363635643634 Jan 15 00:23:39.022000 audit: BPF prog-id=209 op=UNLOAD Jan 15 00:23:39.022000 audit[4450]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4437 pid=4450 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:39.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566633761663635313639356435613336303366386261363635643634 Jan 15 00:23:39.022000 audit: BPF prog-id=208 op=UNLOAD Jan 15 00:23:39.022000 audit[4450]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4437 pid=4450 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:39.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566633761663635313639356435613336303366386261363635643634 Jan 15 00:23:39.022000 audit: BPF prog-id=210 op=LOAD Jan 15 00:23:39.022000 audit[4450]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40000fe648 a2=98 a3=0 items=0 ppid=4437 pid=4450 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:39.022000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566633761663635313639356435613336303366386261363635643634 Jan 15 00:23:39.025695 systemd-networkd[1612]: cali02be44445d6: Link UP Jan 15 00:23:39.025918 systemd-networkd[1612]: cali02be44445d6: Gained carrier Jan 15 00:23:39.039736 containerd[1703]: 2026-01-15 00:23:38.921 [INFO][4395] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--v4vh7-eth0 calico-apiserver-d7bdbd78b- calico-apiserver 782417a5-ecd0-40c5-85c0-45ead5d347fd 873 0 2026-01-15 00:22:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:d7bdbd78b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4515-1-0-n-1ddc109f0f calico-apiserver-d7bdbd78b-v4vh7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali02be44445d6 [] [] }} ContainerID="228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-v4vh7" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--v4vh7-" Jan 15 00:23:39.039736 containerd[1703]: 2026-01-15 00:23:38.921 [INFO][4395] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-v4vh7" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--v4vh7-eth0" Jan 15 00:23:39.039736 containerd[1703]: 2026-01-15 00:23:38.957 [INFO][4412] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" HandleID="k8s-pod-network.228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--v4vh7-eth0" Jan 15 00:23:39.039943 containerd[1703]: 2026-01-15 00:23:38.958 [INFO][4412] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" HandleID="k8s-pod-network.228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" 
Workload="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--v4vh7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dd590), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4515-1-0-n-1ddc109f0f", "pod":"calico-apiserver-d7bdbd78b-v4vh7", "timestamp":"2026-01-15 00:23:38.957909456 +0000 UTC"}, Hostname:"ci-4515-1-0-n-1ddc109f0f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 00:23:39.039943 containerd[1703]: 2026-01-15 00:23:38.958 [INFO][4412] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 00:23:39.039943 containerd[1703]: 2026-01-15 00:23:38.958 [INFO][4412] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 00:23:39.039943 containerd[1703]: 2026-01-15 00:23:38.958 [INFO][4412] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-1ddc109f0f' Jan 15 00:23:39.039943 containerd[1703]: 2026-01-15 00:23:38.985 [INFO][4412] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.039943 containerd[1703]: 2026-01-15 00:23:38.993 [INFO][4412] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.039943 containerd[1703]: 2026-01-15 00:23:39.000 [INFO][4412] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.039943 containerd[1703]: 2026-01-15 00:23:39.003 [INFO][4412] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.039943 containerd[1703]: 2026-01-15 00:23:39.005 [INFO][4412] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.040139 containerd[1703]: 2026-01-15 00:23:39.005 [INFO][4412] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.040139 containerd[1703]: 2026-01-15 00:23:39.007 [INFO][4412] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291 Jan 15 00:23:39.040139 containerd[1703]: 2026-01-15 00:23:39.014 [INFO][4412] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.040139 containerd[1703]: 2026-01-15 00:23:39.019 [INFO][4412] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.106.2/26] block=192.168.106.0/26 handle="k8s-pod-network.228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.040139 containerd[1703]: 2026-01-15 00:23:39.019 [INFO][4412] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.2/26] handle="k8s-pod-network.228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.040139 containerd[1703]: 2026-01-15 00:23:39.019 [INFO][4412] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 15 00:23:39.040139 containerd[1703]: 2026-01-15 00:23:39.019 [INFO][4412] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.2/26] IPv6=[] ContainerID="228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" HandleID="k8s-pod-network.228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--v4vh7-eth0" Jan 15 00:23:39.041366 containerd[1703]: 2026-01-15 00:23:39.021 [INFO][4395] cni-plugin/k8s.go 418: Populated endpoint ContainerID="228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-v4vh7" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--v4vh7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--v4vh7-eth0", GenerateName:"calico-apiserver-d7bdbd78b-", Namespace:"calico-apiserver", SelfLink:"", UID:"782417a5-ecd0-40c5-85c0-45ead5d347fd", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 22, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d7bdbd78b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"", Pod:"calico-apiserver-d7bdbd78b-v4vh7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.106.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali02be44445d6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:39.041429 containerd[1703]: 2026-01-15 00:23:39.021 [INFO][4395] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.2/32] ContainerID="228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-v4vh7" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--v4vh7-eth0" Jan 15 00:23:39.041429 containerd[1703]: 2026-01-15 00:23:39.021 [INFO][4395] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali02be44445d6 ContainerID="228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-v4vh7" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--v4vh7-eth0" Jan 15 00:23:39.041429 containerd[1703]: 2026-01-15 00:23:39.026 [INFO][4395] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-v4vh7" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--v4vh7-eth0" Jan 15 00:23:39.041486 containerd[1703]: 2026-01-15 00:23:39.026 
[INFO][4395] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-v4vh7" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--v4vh7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--v4vh7-eth0", GenerateName:"calico-apiserver-d7bdbd78b-", Namespace:"calico-apiserver", SelfLink:"", UID:"782417a5-ecd0-40c5-85c0-45ead5d347fd", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 22, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d7bdbd78b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291", Pod:"calico-apiserver-d7bdbd78b-v4vh7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.106.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali02be44445d6", MAC:"52:85:e4:ec:1a:0a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:39.041533 containerd[1703]: 2026-01-15 00:23:39.037 [INFO][4395] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-v4vh7" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--v4vh7-eth0" Jan 15 00:23:39.059000 audit[4488]: NETFILTER_CFG table=filter:126 family=2 entries=50 op=nft_register_chain pid=4488 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 00:23:39.059000 audit[4488]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=28208 a0=3 a1=ffffe866a0b0 a2=0 a3=ffffac0c1fa8 items=0 ppid=4162 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:39.059000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 00:23:39.064699 containerd[1703]: time="2026-01-15T00:23:39.064617302Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-955f9fbff-rtvzr,Uid:14a4f6b6-e857-4a63-b075-14b068610222,Namespace:calico-system,Attempt:0,} returns sandbox id \"5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232\"" Jan 15 00:23:39.067074 containerd[1703]: time="2026-01-15T00:23:39.067038270Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 
00:23:39.074791 containerd[1703]: time="2026-01-15T00:23:39.074519933Z" level=info msg="connecting to shim 228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291" address="unix:///run/containerd/s/e2b16ba013a7de77f319b3a873e4a985834bdf5d07fecbc8af68ad076f18e4c1" namespace=k8s.io protocol=ttrpc version=3 Jan 15 00:23:39.109458 systemd[1]: Started cri-containerd-228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291.scope - libcontainer container 228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291. Jan 15 00:23:39.118000 audit: BPF prog-id=211 op=LOAD Jan 15 00:23:39.118000 audit: BPF prog-id=212 op=LOAD Jan 15 00:23:39.118000 audit[4509]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4498 pid=4509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:39.118000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232386439376638666161653963366233353365323430626665363534 Jan 15 00:23:39.119000 audit: BPF prog-id=212 op=UNLOAD Jan 15 00:23:39.119000 audit[4509]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4498 pid=4509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:39.119000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232386439376638666161653963366233353365323430626665363534 Jan 15 00:23:39.119000 audit: BPF prog-id=213 op=LOAD Jan 15 00:23:39.119000 audit[4509]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4498 pid=4509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:39.119000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232386439376638666161653963366233353365323430626665363534 Jan 15 00:23:39.119000 audit: BPF prog-id=214 op=LOAD Jan 15 00:23:39.119000 audit[4509]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4498 pid=4509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:39.119000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232386439376638666161653963366233353365323430626665363534 Jan 15 00:23:39.119000 audit: BPF prog-id=214 op=UNLOAD Jan 15 00:23:39.119000 audit[4509]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4498 pid=4509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:39.119000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232386439376638666161653963366233353365323430626665363534 Jan 15 00:23:39.119000 audit: BPF prog-id=213 op=UNLOAD Jan 15 00:23:39.119000 audit[4509]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4498 pid=4509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:39.119000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232386439376638666161653963366233353365323430626665363534 Jan 15 00:23:39.119000 audit: BPF prog-id=215 op=LOAD Jan 15 00:23:39.119000 audit[4509]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4498 pid=4509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:39.119000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232386439376638666161653963366233353365323430626665363534 Jan 15 00:23:39.145345 containerd[1703]: time="2026-01-15T00:23:39.145301509Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d7bdbd78b-v4vh7,Uid:782417a5-ecd0-40c5-85c0-45ead5d347fd,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291\"" Jan 15 00:23:39.408079 containerd[1703]: time="2026-01-15T00:23:39.407925872Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:39.409367 containerd[1703]: time="2026-01-15T00:23:39.409251917Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 00:23:39.409792 kubelet[2931]: E0115 00:23:39.409570 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 00:23:39.409792 kubelet[2931]: E0115 00:23:39.409620 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 00:23:39.409893 containerd[1703]: time="2026-01-15T00:23:39.409338357Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 00:23:39.409936 
containerd[1703]: time="2026-01-15T00:23:39.409906639Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 00:23:39.410033 kubelet[2931]: E0115 00:23:39.409970 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:956fb8f8794a4941a07f54797f4e6220,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-flcf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-955f9fbff-rtvzr_calico-system(14a4f6b6-e857-4a63-b075-14b068610222): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:39.747472 containerd[1703]: time="2026-01-15T00:23:39.747332991Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:39.748713 containerd[1703]: time="2026-01-15T00:23:39.748663595Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 00:23:39.748775 containerd[1703]: time="2026-01-15T00:23:39.748705035Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 00:23:39.749182 kubelet[2931]: E0115 00:23:39.749131 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:23:39.749249 kubelet[2931]: E0115 00:23:39.749227 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:23:39.749487 kubelet[2931]: E0115 00:23:39.749437 2931 kuberuntime_manager.go:1341] "Unhandled Error" 
err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5q7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-d7bdbd78b-v4vh7_calico-apiserver(782417a5-ecd0-40c5-85c0-45ead5d347fd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:39.749917 containerd[1703]: time="2026-01-15T00:23:39.749888678Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 00:23:39.750791 kubelet[2931]: E0115 00:23:39.750677 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:23:39.873102 containerd[1703]: time="2026-01-15T00:23:39.873063575Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-cd46w,Uid:576324d0-4c45-424a-9f35-c0de23b9b1ac,Namespace:calico-system,Attempt:0,}" Jan 15 00:23:39.921611 systemd-networkd[1612]: vxlan.calico: Gained IPv6LL Jan 15 00:23:39.979763 systemd-networkd[1612]: cali0f51190cdb6: Link UP Jan 15 00:23:39.980330 systemd-networkd[1612]: cali0f51190cdb6: Gained carrier Jan 15 00:23:39.994726 containerd[1703]: 2026-01-15 00:23:39.908 
[INFO][4535] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--1ddc109f0f-k8s-goldmane--666569f655--cd46w-eth0 goldmane-666569f655- calico-system 576324d0-4c45-424a-9f35-c0de23b9b1ac 876 0 2026-01-15 00:22:59 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4515-1-0-n-1ddc109f0f goldmane-666569f655-cd46w eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali0f51190cdb6 [] [] }} ContainerID="100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" Namespace="calico-system" Pod="goldmane-666569f655-cd46w" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-goldmane--666569f655--cd46w-" Jan 15 00:23:39.994726 containerd[1703]: 2026-01-15 00:23:39.909 [INFO][4535] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" Namespace="calico-system" Pod="goldmane-666569f655-cd46w" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-goldmane--666569f655--cd46w-eth0" Jan 15 00:23:39.994726 containerd[1703]: 2026-01-15 00:23:39.936 [INFO][4549] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" HandleID="k8s-pod-network.100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-goldmane--666569f655--cd46w-eth0" Jan 15 00:23:39.995115 containerd[1703]: 2026-01-15 00:23:39.936 [INFO][4549] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" HandleID="k8s-pod-network.100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-goldmane--666569f655--cd46w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dcfd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-n-1ddc109f0f", "pod":"goldmane-666569f655-cd46w", "timestamp":"2026-01-15 00:23:39.936059008 +0000 UTC"}, Hostname:"ci-4515-1-0-n-1ddc109f0f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 00:23:39.995115 containerd[1703]: 2026-01-15 00:23:39.936 [INFO][4549] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 00:23:39.995115 containerd[1703]: 2026-01-15 00:23:39.936 [INFO][4549] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 15 00:23:39.995115 containerd[1703]: 2026-01-15 00:23:39.936 [INFO][4549] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-1ddc109f0f' Jan 15 00:23:39.995115 containerd[1703]: 2026-01-15 00:23:39.950 [INFO][4549] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.995115 containerd[1703]: 2026-01-15 00:23:39.954 [INFO][4549] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.995115 containerd[1703]: 2026-01-15 00:23:39.959 [INFO][4549] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.995115 containerd[1703]: 2026-01-15 00:23:39.962 [INFO][4549] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.995115 containerd[1703]: 2026-01-15 00:23:39.964 [INFO][4549] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.995399 containerd[1703]: 2026-01-15 00:23:39.964 [INFO][4549] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.995399 containerd[1703]: 2026-01-15 00:23:39.965 [INFO][4549] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13 Jan 15 00:23:39.995399 containerd[1703]: 2026-01-15 00:23:39.969 [INFO][4549] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.995399 containerd[1703]: 2026-01-15 00:23:39.975 [INFO][4549] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.106.3/26] block=192.168.106.0/26 handle="k8s-pod-network.100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.995399 containerd[1703]: 2026-01-15 00:23:39.975 [INFO][4549] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.3/26] handle="k8s-pod-network.100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:39.995399 containerd[1703]: 2026-01-15 00:23:39.976 [INFO][4549] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
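Every image pull attempted in this window fails the same way: the resolver gets a 404 from ghcr.io for the v3.30.4 tag ("fetch failed after status: 404 Not Found"), kubelet surfaces it as ErrImagePull, and later sync attempts fall back to ImagePullBackOff; the goldmane pull a little further down hits the identical error. The resolution step can be reproduced away from the node by asking the registry for the tag's manifest. The sketch below is only a diagnostic illustration and assumes the standard Docker Registry v2 token flow that ghcr.io implements; the endpoints, parameters, and media types are assumptions based on that spec, not taken from this log:

    import json, re, urllib.error, urllib.request

    # Media types commonly advertised when resolving a tag; the exact set is an assumption.
    ACCEPT = ", ".join([
        "application/vnd.oci.image.index.v1+json",
        "application/vnd.docker.distribution.manifest.list.v2+json",
        "application/vnd.docker.distribution.manifest.v2+json",
    ])

    def manifest_status(registry: str, repo: str, tag: str) -> int:
        # Ask the registry for the tag's manifest; 404 means the tag cannot be resolved.
        url = f"https://{registry}/v2/{repo}/manifests/{tag}"
        headers = {"Accept": ACCEPT}
        try:
            return urllib.request.urlopen(urllib.request.Request(url, headers=headers)).status
        except urllib.error.HTTPError as err:
            if err.code != 401:
                return err.code
            # Anonymous pulls are normally challenged first; follow the Bearer challenge.
            challenge = dict(re.findall(r'(\w+)="([^"]*)"', err.headers["WWW-Authenticate"]))
            token_url = f'{challenge["realm"]}?service={challenge["service"]}&scope=repository:{repo}:pull'
            token = json.load(urllib.request.urlopen(token_url))["token"]
            headers["Authorization"] = f"Bearer {token}"
            try:
                return urllib.request.urlopen(urllib.request.Request(url, headers=headers)).status
            except urllib.error.HTTPError as err2:
                return err2.code

    # 404 here corresponds to the ErrImagePull entries in this log; 200 means the tag resolves.
    print(manifest_status("ghcr.io", "flatcar/calico/goldmane", "v3.30.4"))

The back-off visible in the surrounding entries will keep retrying, but it cannot succeed until the expected tags are published or the pod specs reference tags that exist.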
Jan 15 00:23:39.995399 containerd[1703]: 2026-01-15 00:23:39.976 [INFO][4549] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.3/26] IPv6=[] ContainerID="100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" HandleID="k8s-pod-network.100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-goldmane--666569f655--cd46w-eth0" Jan 15 00:23:39.995530 containerd[1703]: 2026-01-15 00:23:39.977 [INFO][4535] cni-plugin/k8s.go 418: Populated endpoint ContainerID="100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" Namespace="calico-system" Pod="goldmane-666569f655-cd46w" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-goldmane--666569f655--cd46w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-goldmane--666569f655--cd46w-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"576324d0-4c45-424a-9f35-c0de23b9b1ac", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 22, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"", Pod:"goldmane-666569f655-cd46w", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.106.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0f51190cdb6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:39.995584 containerd[1703]: 2026-01-15 00:23:39.978 [INFO][4535] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.3/32] ContainerID="100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" Namespace="calico-system" Pod="goldmane-666569f655-cd46w" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-goldmane--666569f655--cd46w-eth0" Jan 15 00:23:39.995584 containerd[1703]: 2026-01-15 00:23:39.978 [INFO][4535] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0f51190cdb6 ContainerID="100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" Namespace="calico-system" Pod="goldmane-666569f655-cd46w" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-goldmane--666569f655--cd46w-eth0" Jan 15 00:23:39.995584 containerd[1703]: 2026-01-15 00:23:39.980 [INFO][4535] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" Namespace="calico-system" Pod="goldmane-666569f655-cd46w" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-goldmane--666569f655--cd46w-eth0" Jan 15 00:23:39.995642 containerd[1703]: 2026-01-15 00:23:39.981 [INFO][4535] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" 
Namespace="calico-system" Pod="goldmane-666569f655-cd46w" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-goldmane--666569f655--cd46w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-goldmane--666569f655--cd46w-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"576324d0-4c45-424a-9f35-c0de23b9b1ac", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 22, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13", Pod:"goldmane-666569f655-cd46w", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.106.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0f51190cdb6", MAC:"c2:b8:b3:2e:b2:09", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:39.995688 containerd[1703]: 2026-01-15 00:23:39.991 [INFO][4535] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" Namespace="calico-system" Pod="goldmane-666569f655-cd46w" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-goldmane--666569f655--cd46w-eth0" Jan 15 00:23:40.009000 audit[4567]: NETFILTER_CFG table=filter:127 family=2 entries=54 op=nft_register_chain pid=4567 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 00:23:40.009000 audit[4567]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=29220 a0=3 a1=fffff4af7440 a2=0 a3=ffff9ac09fa8 items=0 ppid=4162 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:40.009000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 00:23:40.020759 containerd[1703]: time="2026-01-15T00:23:40.020716187Z" level=info msg="connecting to shim 100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13" address="unix:///run/containerd/s/6a20298a6ac87c4a5da2b09c4c39cee0ea98a4034925d25b3ff2738f30355eba" namespace=k8s.io protocol=ttrpc version=3 Jan 15 00:23:40.046201 kubelet[2931]: E0115 00:23:40.045535 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:23:40.046574 systemd[1]: Started cri-containerd-100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13.scope - libcontainer container 100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13. Jan 15 00:23:40.067000 audit: BPF prog-id=216 op=LOAD Jan 15 00:23:40.068000 audit: BPF prog-id=217 op=LOAD Jan 15 00:23:40.068000 audit[4586]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=4575 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:40.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130306263623633333861393961386163666361626631333662323335 Jan 15 00:23:40.068000 audit: BPF prog-id=217 op=UNLOAD Jan 15 00:23:40.068000 audit[4586]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4575 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:40.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130306263623633333861393961386163666361626631333662323335 Jan 15 00:23:40.068000 audit: BPF prog-id=218 op=LOAD Jan 15 00:23:40.068000 audit[4586]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=4575 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:40.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130306263623633333861393961386163666361626631333662323335 Jan 15 00:23:40.068000 audit: BPF prog-id=219 op=LOAD Jan 15 00:23:40.068000 audit[4586]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=4575 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:40.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130306263623633333861393961386163666361626631333662323335 Jan 15 00:23:40.068000 audit: BPF prog-id=219 op=UNLOAD Jan 15 00:23:40.068000 audit[4586]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4575 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:40.068000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130306263623633333861393961386163666361626631333662323335 Jan 15 00:23:40.068000 audit: BPF prog-id=218 op=UNLOAD Jan 15 00:23:40.068000 audit[4586]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4575 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:40.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130306263623633333861393961386163666361626631333662323335 Jan 15 00:23:40.068000 audit: BPF prog-id=220 op=LOAD Jan 15 00:23:40.068000 audit[4586]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=4575 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:40.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130306263623633333861393961386163666361626631333662323335 Jan 15 00:23:40.071000 audit[4606]: NETFILTER_CFG table=filter:128 family=2 entries=20 op=nft_register_rule pid=4606 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:40.071000 audit[4606]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffff04169b0 a2=0 a3=1 items=0 ppid=3036 pid=4606 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:40.071000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:40.079000 audit[4606]: NETFILTER_CFG table=nat:129 family=2 entries=14 op=nft_register_rule pid=4606 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:40.079000 audit[4606]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=fffff04169b0 a2=0 a3=1 items=0 ppid=3036 pid=4606 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:40.079000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:40.083535 containerd[1703]: time="2026-01-15T00:23:40.083500739Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:40.085602 containerd[1703]: time="2026-01-15T00:23:40.085567465Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 00:23:40.085806 containerd[1703]: 
time="2026-01-15T00:23:40.085742066Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 00:23:40.086031 kubelet[2931]: E0115 00:23:40.085983 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 00:23:40.086031 kubelet[2931]: E0115 00:23:40.086041 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 00:23:40.086228 kubelet[2931]: E0115 00:23:40.086146 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-flcf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-955f9fbff-rtvzr_calico-system(14a4f6b6-e857-4a63-b075-14b068610222): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:40.087531 kubelet[2931]: E0115 00:23:40.087491 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", 
failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:23:40.095293 containerd[1703]: time="2026-01-15T00:23:40.095252655Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-cd46w,Uid:576324d0-4c45-424a-9f35-c0de23b9b1ac,Namespace:calico-system,Attempt:0,} returns sandbox id \"100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13\"" Jan 15 00:23:40.097846 containerd[1703]: time="2026-01-15T00:23:40.097744982Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 00:23:40.416030 containerd[1703]: time="2026-01-15T00:23:40.415866595Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:40.417107 containerd[1703]: time="2026-01-15T00:23:40.416984439Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 00:23:40.417107 containerd[1703]: time="2026-01-15T00:23:40.416991799Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 00:23:40.417301 kubelet[2931]: E0115 00:23:40.417244 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 00:23:40.417301 kubelet[2931]: E0115 00:23:40.417296 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 00:23:40.417490 kubelet[2931]: E0115 00:23:40.417434 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqmw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-cd46w_calico-system(576324d0-4c45-424a-9f35-c0de23b9b1ac): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:40.418961 kubelet[2931]: E0115 00:23:40.418818 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:23:40.689334 systemd-networkd[1612]: cali02be44445d6: Gained IPv6LL Jan 15 00:23:40.873126 containerd[1703]: 
time="2026-01-15T00:23:40.872876913Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-92nsn,Uid:e8af8aef-db47-4bb0-9303-531f44a2593e,Namespace:calico-system,Attempt:0,}" Jan 15 00:23:40.873126 containerd[1703]: time="2026-01-15T00:23:40.873048874Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-67zqw,Uid:d39e4a7d-cae8-4f75-a430-aaeae5982278,Namespace:kube-system,Attempt:0,}" Jan 15 00:23:40.946426 systemd-networkd[1612]: calief1fd2fe0fb: Gained IPv6LL Jan 15 00:23:40.986470 systemd-networkd[1612]: cali34b4055ec5c: Link UP Jan 15 00:23:40.986803 systemd-networkd[1612]: cali34b4055ec5c: Gained carrier Jan 15 00:23:41.005698 containerd[1703]: 2026-01-15 00:23:40.914 [INFO][4616] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--1ddc109f0f-k8s-csi--node--driver--92nsn-eth0 csi-node-driver- calico-system e8af8aef-db47-4bb0-9303-531f44a2593e 735 0 2026-01-15 00:23:02 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4515-1-0-n-1ddc109f0f csi-node-driver-92nsn eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali34b4055ec5c [] [] }} ContainerID="188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" Namespace="calico-system" Pod="csi-node-driver-92nsn" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-csi--node--driver--92nsn-" Jan 15 00:23:41.005698 containerd[1703]: 2026-01-15 00:23:40.914 [INFO][4616] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" Namespace="calico-system" Pod="csi-node-driver-92nsn" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-csi--node--driver--92nsn-eth0" Jan 15 00:23:41.005698 containerd[1703]: 2026-01-15 00:23:40.945 [INFO][4644] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" HandleID="k8s-pod-network.188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-csi--node--driver--92nsn-eth0" Jan 15 00:23:41.006304 containerd[1703]: 2026-01-15 00:23:40.945 [INFO][4644] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" HandleID="k8s-pod-network.188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-csi--node--driver--92nsn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024a040), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-n-1ddc109f0f", "pod":"csi-node-driver-92nsn", "timestamp":"2026-01-15 00:23:40.945127574 +0000 UTC"}, Hostname:"ci-4515-1-0-n-1ddc109f0f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 00:23:41.006304 containerd[1703]: 2026-01-15 00:23:40.945 [INFO][4644] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Jan 15 00:23:41.006304 containerd[1703]: 2026-01-15 00:23:40.945 [INFO][4644] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 00:23:41.006304 containerd[1703]: 2026-01-15 00:23:40.945 [INFO][4644] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-1ddc109f0f' Jan 15 00:23:41.006304 containerd[1703]: 2026-01-15 00:23:40.956 [INFO][4644] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.006304 containerd[1703]: 2026-01-15 00:23:40.960 [INFO][4644] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.006304 containerd[1703]: 2026-01-15 00:23:40.964 [INFO][4644] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.006304 containerd[1703]: 2026-01-15 00:23:40.966 [INFO][4644] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.006304 containerd[1703]: 2026-01-15 00:23:40.969 [INFO][4644] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.006495 containerd[1703]: 2026-01-15 00:23:40.969 [INFO][4644] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.006495 containerd[1703]: 2026-01-15 00:23:40.970 [INFO][4644] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9 Jan 15 00:23:41.006495 containerd[1703]: 2026-01-15 00:23:40.974 [INFO][4644] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.006495 containerd[1703]: 2026-01-15 00:23:40.981 [INFO][4644] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.106.4/26] block=192.168.106.0/26 handle="k8s-pod-network.188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.006495 containerd[1703]: 2026-01-15 00:23:40.982 [INFO][4644] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.4/26] handle="k8s-pod-network.188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.006495 containerd[1703]: 2026-01-15 00:23:40.982 [INFO][4644] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
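The ipam/ipam.go lines above show Calico's block-based allocation end to end: the node's affinity to 192.168.106.0/26 is confirmed, the block is loaded, and 192.168.106.4/26 is claimed for csi-node-driver-92nsn, with the host-wide lock held only for the duration of the claim. A toy sketch of just the block arithmetic (not Calico's allocator; the already-assigned addresses are assumed from the earlier pods on this node):

    # Toy illustration of block-based IPAM arithmetic (not Calico's implementation):
    # a node-affine /26 block and the next free /32 picked from it.
    import ipaddress

    block = ipaddress.ip_network("192.168.106.0/26")   # block affine to ci-4515-1-0-n-1ddc109f0f
    already_assigned = {                                # assumed earlier allocations on this node
        ipaddress.ip_address("192.168.106.1"),
        ipaddress.ip_address("192.168.106.2"),
        ipaddress.ip_address("192.168.106.3"),
    }

    print(block.num_addresses)                          # 64 addresses per /26 block

    # Walk the block and take the first address not yet handed out.
    next_ip = next(ip for ip in block.hosts() if ip not in already_assigned)
    print(f"{next_ip}/32")                              # 192.168.106.4/32, as claimed in the log

The affinity lets the node allocate from a block it owns, while each individual claim is still persisted ("Writing block in order to claim IPs" above) before the lock is released.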
Jan 15 00:23:41.006495 containerd[1703]: 2026-01-15 00:23:40.982 [INFO][4644] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.4/26] IPv6=[] ContainerID="188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" HandleID="k8s-pod-network.188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-csi--node--driver--92nsn-eth0" Jan 15 00:23:41.006615 containerd[1703]: 2026-01-15 00:23:40.984 [INFO][4616] cni-plugin/k8s.go 418: Populated endpoint ContainerID="188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" Namespace="calico-system" Pod="csi-node-driver-92nsn" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-csi--node--driver--92nsn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-csi--node--driver--92nsn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e8af8aef-db47-4bb0-9303-531f44a2593e", ResourceVersion:"735", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 23, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"", Pod:"csi-node-driver-92nsn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.106.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali34b4055ec5c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:41.006661 containerd[1703]: 2026-01-15 00:23:40.984 [INFO][4616] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.4/32] ContainerID="188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" Namespace="calico-system" Pod="csi-node-driver-92nsn" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-csi--node--driver--92nsn-eth0" Jan 15 00:23:41.006661 containerd[1703]: 2026-01-15 00:23:40.984 [INFO][4616] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali34b4055ec5c ContainerID="188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" Namespace="calico-system" Pod="csi-node-driver-92nsn" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-csi--node--driver--92nsn-eth0" Jan 15 00:23:41.006661 containerd[1703]: 2026-01-15 00:23:40.988 [INFO][4616] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" Namespace="calico-system" Pod="csi-node-driver-92nsn" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-csi--node--driver--92nsn-eth0" Jan 15 00:23:41.007276 containerd[1703]: 2026-01-15 00:23:40.988 [INFO][4616] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" Namespace="calico-system" Pod="csi-node-driver-92nsn" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-csi--node--driver--92nsn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-csi--node--driver--92nsn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e8af8aef-db47-4bb0-9303-531f44a2593e", ResourceVersion:"735", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 23, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9", Pod:"csi-node-driver-92nsn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.106.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali34b4055ec5c", MAC:"ee:e2:00:45:e9:30", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:41.007345 containerd[1703]: 2026-01-15 00:23:41.004 [INFO][4616] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" Namespace="calico-system" Pod="csi-node-driver-92nsn" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-csi--node--driver--92nsn-eth0" Jan 15 00:23:41.015000 audit[4673]: NETFILTER_CFG table=filter:130 family=2 entries=40 op=nft_register_chain pid=4673 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 00:23:41.015000 audit[4673]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=20748 a0=3 a1=fffff93acbe0 a2=0 a3=ffff83fc3fa8 items=0 ppid=4162 pid=4673 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.015000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 00:23:41.034611 containerd[1703]: time="2026-01-15T00:23:41.034513048Z" level=info msg="connecting to shim 188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9" address="unix:///run/containerd/s/cc16f48558d29ee10222cbb2e0e34e017ac9df7ec3c4eaf7f00da7e394b05d79" namespace=k8s.io protocol=ttrpc version=3 Jan 15 00:23:41.054432 systemd[1]: Started cri-containerd-188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9.scope - libcontainer container 188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9. 
Jan 15 00:23:41.056251 kubelet[2931]: E0115 00:23:41.055711 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:23:41.056251 kubelet[2931]: E0115 00:23:41.056038 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:23:41.057291 kubelet[2931]: E0115 00:23:41.056741 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:23:41.068000 audit: BPF prog-id=221 op=LOAD Jan 15 00:23:41.069000 audit: BPF prog-id=222 op=LOAD Jan 15 00:23:41.069000 audit[4694]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=4682 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.069000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138386562326330626364306361303336623063306534373235323562 Jan 15 00:23:41.069000 audit: BPF prog-id=222 op=UNLOAD Jan 15 00:23:41.069000 audit[4694]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4682 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.069000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138386562326330626364306361303336623063306534373235323562 Jan 15 00:23:41.069000 audit: BPF prog-id=223 op=LOAD 
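After the initial ErrImagePull, the kubelet entries above switch to ImagePullBackOff: the pull is not retried immediately but on an exponential back-off. Per the Kubernetes documentation the delay roughly doubles from a 10-second base and is capped at 5 minutes; the toy sketch below only illustrates that schedule, with the constants taken from the documented defaults rather than from this node's kubelet configuration:

    # Toy sketch of the exponential image-pull back-off schedule (assumed defaults:
    # 10s base, doubling per failure, capped at 300s — per the Kubernetes docs, not
    # read from this node's kubelet configuration).
    BASE_SECONDS = 10
    CAP_SECONDS = 300

    def backoff_delays(failures: int) -> list[int]:
        """Delay before each retry after `failures` consecutive pull failures."""
        return [min(BASE_SECONDS * 2**i, CAP_SECONDS) for i in range(failures)]

    print(backoff_delays(7))   # [10, 20, 40, 80, 160, 300, 300]

This is why the same pods keep reappearing in the log with longer gaps between "Back-off pulling image" messages while the underlying 404 persists.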
Jan 15 00:23:41.069000 audit[4694]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=4682 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.069000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138386562326330626364306361303336623063306534373235323562 Jan 15 00:23:41.069000 audit: BPF prog-id=224 op=LOAD Jan 15 00:23:41.069000 audit[4694]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=4682 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.069000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138386562326330626364306361303336623063306534373235323562 Jan 15 00:23:41.069000 audit: BPF prog-id=224 op=UNLOAD Jan 15 00:23:41.069000 audit[4694]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4682 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.069000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138386562326330626364306361303336623063306534373235323562 Jan 15 00:23:41.069000 audit: BPF prog-id=223 op=UNLOAD Jan 15 00:23:41.069000 audit[4694]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4682 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.069000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138386562326330626364306361303336623063306534373235323562 Jan 15 00:23:41.069000 audit: BPF prog-id=225 op=LOAD Jan 15 00:23:41.069000 audit[4694]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=4682 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.069000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138386562326330626364306361303336623063306534373235323562 Jan 15 00:23:41.074509 systemd-networkd[1612]: cali0f51190cdb6: Gained IPv6LL Jan 15 00:23:41.083000 audit[4715]: NETFILTER_CFG table=filter:131 family=2 entries=20 op=nft_register_rule pid=4715 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:41.083000 audit[4715]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffd2d1e720 a2=0 a3=1 items=0 ppid=3036 pid=4715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.083000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:41.091000 audit[4715]: NETFILTER_CFG table=nat:132 family=2 entries=14 op=nft_register_rule pid=4715 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:41.101088 kernel: kauditd_printk_skb: 312 callbacks suppressed Jan 15 00:23:41.101293 kernel: audit: type=1325 audit(1768436621.091:694): table=nat:132 family=2 entries=14 op=nft_register_rule pid=4715 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:41.091000 audit[4715]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffd2d1e720 a2=0 a3=1 items=0 ppid=3036 pid=4715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.107627 kernel: audit: type=1300 audit(1768436621.091:694): arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffd2d1e720 a2=0 a3=1 items=0 ppid=3036 pid=4715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.091000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:41.109953 kernel: audit: type=1327 audit(1768436621.091:694): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:41.124000 audit[4723]: NETFILTER_CFG table=filter:133 family=2 entries=20 op=nft_register_rule pid=4723 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:41.124000 audit[4723]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffe28bf900 a2=0 a3=1 items=0 ppid=3036 pid=4723 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.132038 containerd[1703]: time="2026-01-15T00:23:41.131971626Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-92nsn,Uid:e8af8aef-db47-4bb0-9303-531f44a2593e,Namespace:calico-system,Attempt:0,} returns sandbox id \"188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9\"" Jan 15 00:23:41.133101 kernel: audit: type=1325 audit(1768436621.124:695): table=filter:133 family=2 entries=20 op=nft_register_rule pid=4723 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:41.133160 kernel: audit: type=1300 audit(1768436621.124:695): arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffe28bf900 a2=0 a3=1 items=0 ppid=3036 pid=4723 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.134139 kernel: 
audit: type=1327 audit(1768436621.124:695): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:41.124000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:41.134216 containerd[1703]: time="2026-01-15T00:23:41.133706071Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 00:23:41.135000 audit[4723]: NETFILTER_CFG table=nat:134 family=2 entries=14 op=nft_register_rule pid=4723 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:41.135000 audit[4723]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffe28bf900 a2=0 a3=1 items=0 ppid=3036 pid=4723 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.143734 kernel: audit: type=1325 audit(1768436621.135:696): table=nat:134 family=2 entries=14 op=nft_register_rule pid=4723 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:41.143821 kernel: audit: type=1300 audit(1768436621.135:696): arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffe28bf900 a2=0 a3=1 items=0 ppid=3036 pid=4723 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.135000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:41.145960 kernel: audit: type=1327 audit(1768436621.135:696): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:41.147006 systemd-networkd[1612]: cali9e8ccc3d01d: Link UP Jan 15 00:23:41.147378 systemd-networkd[1612]: cali9e8ccc3d01d: Gained carrier Jan 15 00:23:41.160455 containerd[1703]: 2026-01-15 00:23:40.922 [INFO][4631] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--67zqw-eth0 coredns-668d6bf9bc- kube-system d39e4a7d-cae8-4f75-a430-aaeae5982278 865 0 2026-01-15 00:22:46 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4515-1-0-n-1ddc109f0f coredns-668d6bf9bc-67zqw eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9e8ccc3d01d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" Namespace="kube-system" Pod="coredns-668d6bf9bc-67zqw" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--67zqw-" Jan 15 00:23:41.160455 containerd[1703]: 2026-01-15 00:23:40.922 [INFO][4631] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" Namespace="kube-system" Pod="coredns-668d6bf9bc-67zqw" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--67zqw-eth0" Jan 15 00:23:41.160455 containerd[1703]: 2026-01-15 00:23:40.949 [INFO][4649] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" HandleID="k8s-pod-network.d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--67zqw-eth0" Jan 15 00:23:41.160680 containerd[1703]: 2026-01-15 00:23:40.949 [INFO][4649] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" HandleID="k8s-pod-network.d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--67zqw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400034b060), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4515-1-0-n-1ddc109f0f", "pod":"coredns-668d6bf9bc-67zqw", "timestamp":"2026-01-15 00:23:40.949269547 +0000 UTC"}, Hostname:"ci-4515-1-0-n-1ddc109f0f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 00:23:41.160680 containerd[1703]: 2026-01-15 00:23:40.949 [INFO][4649] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 00:23:41.160680 containerd[1703]: 2026-01-15 00:23:40.982 [INFO][4649] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 00:23:41.160680 containerd[1703]: 2026-01-15 00:23:40.982 [INFO][4649] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-1ddc109f0f' Jan 15 00:23:41.160680 containerd[1703]: 2026-01-15 00:23:41.059 [INFO][4649] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.160680 containerd[1703]: 2026-01-15 00:23:41.088 [INFO][4649] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.160680 containerd[1703]: 2026-01-15 00:23:41.109 [INFO][4649] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.160680 containerd[1703]: 2026-01-15 00:23:41.112 [INFO][4649] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.160680 containerd[1703]: 2026-01-15 00:23:41.118 [INFO][4649] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.160856 containerd[1703]: 2026-01-15 00:23:41.118 [INFO][4649] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.160856 containerd[1703]: 2026-01-15 00:23:41.120 [INFO][4649] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932 Jan 15 00:23:41.160856 containerd[1703]: 2026-01-15 00:23:41.129 [INFO][4649] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.160856 containerd[1703]: 2026-01-15 00:23:41.137 [INFO][4649] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.106.5/26] block=192.168.106.0/26 handle="k8s-pod-network.d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" 
host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.160856 containerd[1703]: 2026-01-15 00:23:41.137 [INFO][4649] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.5/26] handle="k8s-pod-network.d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:41.160856 containerd[1703]: 2026-01-15 00:23:41.138 [INFO][4649] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 15 00:23:41.160856 containerd[1703]: 2026-01-15 00:23:41.138 [INFO][4649] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.5/26] IPv6=[] ContainerID="d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" HandleID="k8s-pod-network.d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--67zqw-eth0" Jan 15 00:23:41.160990 containerd[1703]: 2026-01-15 00:23:41.144 [INFO][4631] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" Namespace="kube-system" Pod="coredns-668d6bf9bc-67zqw" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--67zqw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--67zqw-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"d39e4a7d-cae8-4f75-a430-aaeae5982278", ResourceVersion:"865", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 22, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"", Pod:"coredns-668d6bf9bc-67zqw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9e8ccc3d01d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:41.160990 containerd[1703]: 2026-01-15 00:23:41.144 [INFO][4631] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.5/32] ContainerID="d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" Namespace="kube-system" Pod="coredns-668d6bf9bc-67zqw" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--67zqw-eth0" Jan 15 00:23:41.160990 containerd[1703]: 2026-01-15 00:23:41.144 [INFO][4631] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9e8ccc3d01d 
ContainerID="d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" Namespace="kube-system" Pod="coredns-668d6bf9bc-67zqw" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--67zqw-eth0" Jan 15 00:23:41.160990 containerd[1703]: 2026-01-15 00:23:41.147 [INFO][4631] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" Namespace="kube-system" Pod="coredns-668d6bf9bc-67zqw" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--67zqw-eth0" Jan 15 00:23:41.160990 containerd[1703]: 2026-01-15 00:23:41.147 [INFO][4631] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" Namespace="kube-system" Pod="coredns-668d6bf9bc-67zqw" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--67zqw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--67zqw-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"d39e4a7d-cae8-4f75-a430-aaeae5982278", ResourceVersion:"865", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 22, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932", Pod:"coredns-668d6bf9bc-67zqw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9e8ccc3d01d", MAC:"82:d8:c6:2c:44:c7", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:41.160990 containerd[1703]: 2026-01-15 00:23:41.157 [INFO][4631] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" Namespace="kube-system" Pod="coredns-668d6bf9bc-67zqw" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--67zqw-eth0" Jan 15 00:23:41.180000 audit[4733]: NETFILTER_CFG table=filter:135 family=2 entries=50 op=nft_register_chain pid=4733 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 00:23:41.180000 audit[4733]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=24912 a0=3 a1=fffffaee3900 a2=0 a3=ffff8ac2bfa8 items=0 ppid=4162 pid=4733 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.180000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 00:23:41.184203 kernel: audit: type=1325 audit(1768436621.180:697): table=filter:135 family=2 entries=50 op=nft_register_chain pid=4733 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 00:23:41.190734 containerd[1703]: time="2026-01-15T00:23:41.190655885Z" level=info msg="connecting to shim d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932" address="unix:///run/containerd/s/5b720043cd0766d6daaec94ee2b6798004f6e7b9c7743232fada5c0ea7a2989c" namespace=k8s.io protocol=ttrpc version=3 Jan 15 00:23:41.220588 systemd[1]: Started cri-containerd-d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932.scope - libcontainer container d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932. Jan 15 00:23:41.229000 audit: BPF prog-id=226 op=LOAD Jan 15 00:23:41.230000 audit: BPF prog-id=227 op=LOAD Jan 15 00:23:41.230000 audit[4753]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=4743 pid=4753 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.230000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437376434306630633534646635663536383934633363353839646464 Jan 15 00:23:41.230000 audit: BPF prog-id=227 op=UNLOAD Jan 15 00:23:41.230000 audit[4753]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4743 pid=4753 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.230000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437376434306630633534646635663536383934633363353839646464 Jan 15 00:23:41.230000 audit: BPF prog-id=228 op=LOAD Jan 15 00:23:41.230000 audit[4753]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=4743 pid=4753 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.230000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437376434306630633534646635663536383934633363353839646464 Jan 15 00:23:41.230000 audit: BPF prog-id=229 op=LOAD Jan 15 00:23:41.230000 audit[4753]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=4743 pid=4753 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.230000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437376434306630633534646635663536383934633363353839646464 Jan 15 00:23:41.230000 audit: BPF prog-id=229 op=UNLOAD Jan 15 00:23:41.230000 audit[4753]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4743 pid=4753 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.230000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437376434306630633534646635663536383934633363353839646464 Jan 15 00:23:41.230000 audit: BPF prog-id=228 op=UNLOAD Jan 15 00:23:41.230000 audit[4753]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4743 pid=4753 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.230000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437376434306630633534646635663536383934633363353839646464 Jan 15 00:23:41.230000 audit: BPF prog-id=230 op=LOAD Jan 15 00:23:41.230000 audit[4753]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=4743 pid=4753 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.230000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437376434306630633534646635663536383934633363353839646464 Jan 15 00:23:41.254518 containerd[1703]: time="2026-01-15T00:23:41.254464560Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-67zqw,Uid:d39e4a7d-cae8-4f75-a430-aaeae5982278,Namespace:kube-system,Attempt:0,} returns sandbox id \"d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932\"" Jan 15 00:23:41.257371 containerd[1703]: time="2026-01-15T00:23:41.257330329Z" level=info msg="CreateContainer within sandbox \"d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 15 00:23:41.266566 containerd[1703]: time="2026-01-15T00:23:41.266513117Z" level=info msg="Container 453934082c8179e198c87b5fcbf94a618256a2ce3676e0c6be4eaad9aae87331: CDI devices from CRI Config.CDIDevices: []" Jan 15 00:23:41.273926 containerd[1703]: time="2026-01-15T00:23:41.273876220Z" level=info msg="CreateContainer within sandbox \"d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"453934082c8179e198c87b5fcbf94a618256a2ce3676e0c6be4eaad9aae87331\"" Jan 15 00:23:41.274770 
containerd[1703]: time="2026-01-15T00:23:41.274677422Z" level=info msg="StartContainer for \"453934082c8179e198c87b5fcbf94a618256a2ce3676e0c6be4eaad9aae87331\"" Jan 15 00:23:41.276117 containerd[1703]: time="2026-01-15T00:23:41.276063587Z" level=info msg="connecting to shim 453934082c8179e198c87b5fcbf94a618256a2ce3676e0c6be4eaad9aae87331" address="unix:///run/containerd/s/5b720043cd0766d6daaec94ee2b6798004f6e7b9c7743232fada5c0ea7a2989c" protocol=ttrpc version=3 Jan 15 00:23:41.293540 systemd[1]: Started cri-containerd-453934082c8179e198c87b5fcbf94a618256a2ce3676e0c6be4eaad9aae87331.scope - libcontainer container 453934082c8179e198c87b5fcbf94a618256a2ce3676e0c6be4eaad9aae87331. Jan 15 00:23:41.295475 systemd[1]: Started sshd@9-10.0.3.29:22-119.84.148.253:58460.service - OpenSSH per-connection server daemon (119.84.148.253:58460). Jan 15 00:23:41.294000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.3.29:22-119.84.148.253:58460 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:23:41.309000 audit: BPF prog-id=231 op=LOAD Jan 15 00:23:41.310000 audit: BPF prog-id=232 op=LOAD Jan 15 00:23:41.310000 audit[4780]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=4743 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.310000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435333933343038326338313739653139386338376235666362663934 Jan 15 00:23:41.310000 audit: BPF prog-id=232 op=UNLOAD Jan 15 00:23:41.310000 audit[4780]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4743 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.310000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435333933343038326338313739653139386338376235666362663934 Jan 15 00:23:41.310000 audit: BPF prog-id=233 op=LOAD Jan 15 00:23:41.310000 audit[4780]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=4743 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.310000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435333933343038326338313739653139386338376235666362663934 Jan 15 00:23:41.310000 audit: BPF prog-id=234 op=LOAD Jan 15 00:23:41.310000 audit[4780]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=4743 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.310000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435333933343038326338313739653139386338376235666362663934 Jan 15 00:23:41.310000 audit: BPF prog-id=234 op=UNLOAD Jan 15 00:23:41.310000 audit[4780]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4743 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.310000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435333933343038326338313739653139386338376235666362663934 Jan 15 00:23:41.310000 audit: BPF prog-id=233 op=UNLOAD Jan 15 00:23:41.310000 audit[4780]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4743 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.310000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435333933343038326338313739653139386338376235666362663934 Jan 15 00:23:41.310000 audit: BPF prog-id=235 op=LOAD Jan 15 00:23:41.310000 audit[4780]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=4743 pid=4780 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:41.310000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435333933343038326338313739653139386338376235666362663934 Jan 15 00:23:41.331427 containerd[1703]: time="2026-01-15T00:23:41.331384716Z" level=info msg="StartContainer for \"453934082c8179e198c87b5fcbf94a618256a2ce3676e0c6be4eaad9aae87331\" returns successfully" Jan 15 00:23:41.466398 containerd[1703]: time="2026-01-15T00:23:41.466275328Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:41.468890 containerd[1703]: time="2026-01-15T00:23:41.468753376Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 00:23:41.469248 containerd[1703]: time="2026-01-15T00:23:41.468789416Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 00:23:41.469433 kubelet[2931]: E0115 00:23:41.469052 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 00:23:41.469433 kubelet[2931]: E0115 00:23:41.469100 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 00:23:41.469433 kubelet[2931]: E0115 00:23:41.469250 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cx8sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-92nsn_calico-system(e8af8aef-db47-4bb0-9303-531f44a2593e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:41.471288 containerd[1703]: time="2026-01-15T00:23:41.471146623Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 00:23:41.799646 containerd[1703]: time="2026-01-15T00:23:41.799575988Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:41.800973 containerd[1703]: time="2026-01-15T00:23:41.800917992Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 00:23:41.801014 containerd[1703]: time="2026-01-15T00:23:41.800961472Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 
00:23:41.801213 kubelet[2931]: E0115 00:23:41.801140 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 00:23:41.801265 kubelet[2931]: E0115 00:23:41.801219 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 00:23:41.801384 kubelet[2931]: E0115 00:23:41.801334 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cx8sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-92nsn_calico-system(e8af8aef-db47-4bb0-9303-531f44a2593e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:41.803527 kubelet[2931]: E0115 00:23:41.803488 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with 
ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:41.872803 containerd[1703]: time="2026-01-15T00:23:41.872757412Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d7bdbd78b-nn9k9,Uid:933e7fe5-e25e-48cf-938a-716b1fa3d838,Namespace:calico-apiserver,Attempt:0,}" Jan 15 00:23:42.011029 systemd-networkd[1612]: cali9eb1b5d670c: Link UP Jan 15 00:23:42.012277 systemd-networkd[1612]: cali9eb1b5d670c: Gained carrier Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:41.913 [INFO][4818] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--nn9k9-eth0 calico-apiserver-d7bdbd78b- calico-apiserver 933e7fe5-e25e-48cf-938a-716b1fa3d838 869 0 2026-01-15 00:22:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:d7bdbd78b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4515-1-0-n-1ddc109f0f calico-apiserver-d7bdbd78b-nn9k9 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali9eb1b5d670c [] [] }} ContainerID="adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-nn9k9" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--nn9k9-" Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:41.913 [INFO][4818] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-nn9k9" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--nn9k9-eth0" Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:41.950 [INFO][4833] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" HandleID="k8s-pod-network.adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--nn9k9-eth0" Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:41.950 [INFO][4833] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" HandleID="k8s-pod-network.adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--nn9k9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137410), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4515-1-0-n-1ddc109f0f", "pod":"calico-apiserver-d7bdbd78b-nn9k9", "timestamp":"2026-01-15 00:23:41.950387529 +0000 UTC"}, Hostname:"ci-4515-1-0-n-1ddc109f0f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:41.951 [INFO][4833] ipam/ipam_plugin.go 377: About to 
acquire host-wide IPAM lock. Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:41.951 [INFO][4833] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:41.951 [INFO][4833] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-1ddc109f0f' Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:41.964 [INFO][4833] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:41.971 [INFO][4833] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:41.979 [INFO][4833] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:41.982 [INFO][4833] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:41.985 [INFO][4833] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:41.985 [INFO][4833] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:41.988 [INFO][4833] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:41.993 [INFO][4833] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:42.001 [INFO][4833] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.106.6/26] block=192.168.106.0/26 handle="k8s-pod-network.adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:42.001 [INFO][4833] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.6/26] handle="k8s-pod-network.adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:42.001 [INFO][4833] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
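
The IPAM entries above show Calico taking the host-wide IPAM lock, confirming the node's affinity for the block 192.168.106.0/26, and then claiming 192.168.106.6 for the calico-apiserver pod (the next pod below gets .7). As a minimal sketch only — this is not Calico's allocator, and the "already claimed" set here is assumed for illustration — the arithmetic of a /26 affinity block and sequential assignment looks like this:

```python
# Minimal illustration (not Calico's allocator): a /26 affinity block such as
# 192.168.106.0/26 holds 64 addresses, and the next free host address is
# handed out in order -- here .6, then .7, matching the claims in the log.
import ipaddress

block = ipaddress.ip_network("192.168.106.0/26")             # 64 addresses total
already_claimed = {f"192.168.106.{i}" for i in range(1, 6)}  # assumed prior claims

def next_free(block: ipaddress.IPv4Network, claimed: set) -> ipaddress.IPv4Address:
    for addr in block.hosts():          # .hosts() skips the network/broadcast addresses
        if str(addr) not in claimed:
            return addr
    raise RuntimeError("affinity block exhausted; IPAM would claim a new block")

print(block.num_addresses)              # 64
print(next_free(block, already_claimed))  # 192.168.106.6
```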
Jan 15 00:23:42.027210 containerd[1703]: 2026-01-15 00:23:42.001 [INFO][4833] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.6/26] IPv6=[] ContainerID="adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" HandleID="k8s-pod-network.adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--nn9k9-eth0" Jan 15 00:23:42.027990 containerd[1703]: 2026-01-15 00:23:42.005 [INFO][4818] cni-plugin/k8s.go 418: Populated endpoint ContainerID="adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-nn9k9" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--nn9k9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--nn9k9-eth0", GenerateName:"calico-apiserver-d7bdbd78b-", Namespace:"calico-apiserver", SelfLink:"", UID:"933e7fe5-e25e-48cf-938a-716b1fa3d838", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 22, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d7bdbd78b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"", Pod:"calico-apiserver-d7bdbd78b-nn9k9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.106.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9eb1b5d670c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:42.027990 containerd[1703]: 2026-01-15 00:23:42.005 [INFO][4818] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.6/32] ContainerID="adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-nn9k9" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--nn9k9-eth0" Jan 15 00:23:42.027990 containerd[1703]: 2026-01-15 00:23:42.005 [INFO][4818] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9eb1b5d670c ContainerID="adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-nn9k9" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--nn9k9-eth0" Jan 15 00:23:42.027990 containerd[1703]: 2026-01-15 00:23:42.010 [INFO][4818] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-nn9k9" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--nn9k9-eth0" Jan 15 00:23:42.027990 containerd[1703]: 2026-01-15 00:23:42.010 
[INFO][4818] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-nn9k9" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--nn9k9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--nn9k9-eth0", GenerateName:"calico-apiserver-d7bdbd78b-", Namespace:"calico-apiserver", SelfLink:"", UID:"933e7fe5-e25e-48cf-938a-716b1fa3d838", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 22, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d7bdbd78b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb", Pod:"calico-apiserver-d7bdbd78b-nn9k9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.106.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9eb1b5d670c", MAC:"3a:5a:e0:48:2e:68", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:42.027990 containerd[1703]: 2026-01-15 00:23:42.024 [INFO][4818] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" Namespace="calico-apiserver" Pod="calico-apiserver-d7bdbd78b-nn9k9" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--apiserver--d7bdbd78b--nn9k9-eth0" Jan 15 00:23:42.042000 audit[4849]: NETFILTER_CFG table=filter:136 family=2 entries=55 op=nft_register_chain pid=4849 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 00:23:42.042000 audit[4849]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=28288 a0=3 a1=fffffa8ec310 a2=0 a3=ffff8204efa8 items=0 ppid=4162 pid=4849 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:42.042000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 00:23:42.067235 kubelet[2931]: E0115 00:23:42.067115 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" 
pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:23:42.070994 kubelet[2931]: E0115 00:23:42.070933 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:42.082829 containerd[1703]: time="2026-01-15T00:23:42.082396093Z" level=info msg="connecting to shim adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb" address="unix:///run/containerd/s/878a5681d494c446192588b64560da35e394a4d350a2745762f9f4414acd3f83" namespace=k8s.io protocol=ttrpc version=3 Jan 15 00:23:42.089444 kubelet[2931]: I0115 00:23:42.089378 2931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-67zqw" podStartSLOduration=56.089359154 podStartE2EDuration="56.089359154s" podCreationTimestamp="2026-01-15 00:22:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 00:23:42.085528942 +0000 UTC m=+63.303112289" watchObservedRunningTime="2026-01-15 00:23:42.089359154 +0000 UTC m=+63.306942501" Jan 15 00:23:42.098321 systemd-networkd[1612]: cali34b4055ec5c: Gained IPv6LL Jan 15 00:23:42.155457 systemd[1]: Started cri-containerd-adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb.scope - libcontainer container adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb. 
Jan 15 00:23:42.168000 audit[4884]: NETFILTER_CFG table=filter:137 family=2 entries=20 op=nft_register_rule pid=4884 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:42.168000 audit[4884]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffe4146e00 a2=0 a3=1 items=0 ppid=3036 pid=4884 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:42.168000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:42.173000 audit[4884]: NETFILTER_CFG table=nat:138 family=2 entries=14 op=nft_register_rule pid=4884 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:42.173000 audit[4884]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffe4146e00 a2=0 a3=1 items=0 ppid=3036 pid=4884 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:42.173000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:42.178000 audit: BPF prog-id=236 op=LOAD Jan 15 00:23:42.179000 audit: BPF prog-id=237 op=LOAD Jan 15 00:23:42.179000 audit[4871]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138180 a2=98 a3=0 items=0 ppid=4859 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:42.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6164663464393163383565323063363766326639346338353838306536 Jan 15 00:23:42.179000 audit: BPF prog-id=237 op=UNLOAD Jan 15 00:23:42.179000 audit[4871]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4859 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:42.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6164663464393163383565323063363766326639346338353838306536 Jan 15 00:23:42.179000 audit: BPF prog-id=238 op=LOAD Jan 15 00:23:42.179000 audit[4871]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001383e8 a2=98 a3=0 items=0 ppid=4859 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:42.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6164663464393163383565323063363766326639346338353838306536 Jan 15 00:23:42.180000 audit: BPF prog-id=239 op=LOAD Jan 15 00:23:42.180000 audit[4871]: SYSCALL 
arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000138168 a2=98 a3=0 items=0 ppid=4859 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:42.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6164663464393163383565323063363766326639346338353838306536 Jan 15 00:23:42.180000 audit: BPF prog-id=239 op=UNLOAD Jan 15 00:23:42.180000 audit[4871]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4859 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:42.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6164663464393163383565323063363766326639346338353838306536 Jan 15 00:23:42.180000 audit: BPF prog-id=238 op=UNLOAD Jan 15 00:23:42.180000 audit[4871]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4859 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:42.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6164663464393163383565323063363766326639346338353838306536 Jan 15 00:23:42.180000 audit: BPF prog-id=240 op=LOAD Jan 15 00:23:42.180000 audit[4871]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138648 a2=98 a3=0 items=0 ppid=4859 pid=4871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:42.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6164663464393163383565323063363766326639346338353838306536 Jan 15 00:23:42.218446 containerd[1703]: time="2026-01-15T00:23:42.218381909Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d7bdbd78b-nn9k9,Uid:933e7fe5-e25e-48cf-938a-716b1fa3d838,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb\"" Jan 15 00:23:42.220615 containerd[1703]: time="2026-01-15T00:23:42.220536435Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 00:23:42.549161 containerd[1703]: time="2026-01-15T00:23:42.549100400Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:42.550485 containerd[1703]: time="2026-01-15T00:23:42.550374644Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 00:23:42.550485 containerd[1703]: time="2026-01-15T00:23:42.550430404Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 00:23:42.550659 kubelet[2931]: E0115 00:23:42.550616 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:23:42.550766 kubelet[2931]: E0115 00:23:42.550674 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:23:42.551197 kubelet[2931]: E0115 00:23:42.551015 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xck2k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-d7bdbd78b-nn9k9_calico-apiserver(933e7fe5-e25e-48cf-938a-716b1fa3d838): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:42.553032 kubelet[2931]: E0115 00:23:42.552973 2931 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:23:42.872847 containerd[1703]: time="2026-01-15T00:23:42.872550990Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ksjks,Uid:cacf3b5e-3a3d-4bb1-94ee-791570a7ddfd,Namespace:kube-system,Attempt:0,}" Jan 15 00:23:42.929498 systemd-networkd[1612]: cali9e8ccc3d01d: Gained IPv6LL Jan 15 00:23:42.981248 systemd-networkd[1612]: calif7b6c94f4d0: Link UP Jan 15 00:23:42.982054 systemd-networkd[1612]: calif7b6c94f4d0: Gained carrier Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.913 [INFO][4896] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--ksjks-eth0 coredns-668d6bf9bc- kube-system cacf3b5e-3a3d-4bb1-94ee-791570a7ddfd 875 0 2026-01-15 00:22:46 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4515-1-0-n-1ddc109f0f coredns-668d6bf9bc-ksjks eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif7b6c94f4d0 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" Namespace="kube-system" Pod="coredns-668d6bf9bc-ksjks" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--ksjks-" Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.913 [INFO][4896] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" Namespace="kube-system" Pod="coredns-668d6bf9bc-ksjks" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--ksjks-eth0" Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.938 [INFO][4911] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" HandleID="k8s-pod-network.d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--ksjks-eth0" Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.938 [INFO][4911] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" HandleID="k8s-pod-network.d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--ksjks-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137720), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4515-1-0-n-1ddc109f0f", "pod":"coredns-668d6bf9bc-ksjks", "timestamp":"2026-01-15 00:23:42.938390111 +0000 UTC"}, Hostname:"ci-4515-1-0-n-1ddc109f0f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.938 [INFO][4911] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM 
lock. Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.938 [INFO][4911] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.938 [INFO][4911] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-1ddc109f0f' Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.948 [INFO][4911] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.953 [INFO][4911] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.957 [INFO][4911] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.958 [INFO][4911] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.961 [INFO][4911] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.961 [INFO][4911] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.962 [INFO][4911] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422 Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.968 [INFO][4911] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.975 [INFO][4911] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.106.7/26] block=192.168.106.0/26 handle="k8s-pod-network.d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.975 [INFO][4911] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.7/26] handle="k8s-pod-network.d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.975 [INFO][4911] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
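
The long hex runs in the audit PROCTITLE fields above (and below) are just the runc command line, hex-encoded with NUL bytes separating the arguments; the kernel truncates them, which is why the container IDs appear cut off. They can be made readable with a couple of lines:

```python
# Decode an audit PROCTITLE value: the process command line, hex-encoded, with
# NUL bytes between arguments (audit truncates long values, so IDs end mid-string).
def decode_proctitle(hexstr: str) -> str:
    raw = bytes.fromhex(hexstr)
    return raw.replace(b"\x00", b" ").decode("utf-8", errors="replace")

# Prefix of one PROCTITLE value from the audit records above.
sample = "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F"
print(decode_proctitle(sample))
# runc --root /run/containerd/runc/k8s.io
```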
Jan 15 00:23:42.995786 containerd[1703]: 2026-01-15 00:23:42.975 [INFO][4911] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.7/26] IPv6=[] ContainerID="d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" HandleID="k8s-pod-network.d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--ksjks-eth0" Jan 15 00:23:42.996779 containerd[1703]: 2026-01-15 00:23:42.977 [INFO][4896] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" Namespace="kube-system" Pod="coredns-668d6bf9bc-ksjks" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--ksjks-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--ksjks-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"cacf3b5e-3a3d-4bb1-94ee-791570a7ddfd", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 22, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"", Pod:"coredns-668d6bf9bc-ksjks", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif7b6c94f4d0", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:42.996779 containerd[1703]: 2026-01-15 00:23:42.977 [INFO][4896] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.7/32] ContainerID="d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" Namespace="kube-system" Pod="coredns-668d6bf9bc-ksjks" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--ksjks-eth0" Jan 15 00:23:42.996779 containerd[1703]: 2026-01-15 00:23:42.977 [INFO][4896] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif7b6c94f4d0 ContainerID="d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" Namespace="kube-system" Pod="coredns-668d6bf9bc-ksjks" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--ksjks-eth0" Jan 15 00:23:42.996779 containerd[1703]: 2026-01-15 00:23:42.982 [INFO][4896] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-ksjks" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--ksjks-eth0" Jan 15 00:23:42.996779 containerd[1703]: 2026-01-15 00:23:42.983 [INFO][4896] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" Namespace="kube-system" Pod="coredns-668d6bf9bc-ksjks" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--ksjks-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--ksjks-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"cacf3b5e-3a3d-4bb1-94ee-791570a7ddfd", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 22, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422", Pod:"coredns-668d6bf9bc-ksjks", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif7b6c94f4d0", MAC:"2a:ba:ac:35:5d:46", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:42.996779 containerd[1703]: 2026-01-15 00:23:42.992 [INFO][4896] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" Namespace="kube-system" Pod="coredns-668d6bf9bc-ksjks" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-coredns--668d6bf9bc--ksjks-eth0" Jan 15 00:23:43.014000 audit[4928]: NETFILTER_CFG table=filter:139 family=2 entries=44 op=nft_register_chain pid=4928 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 00:23:43.014000 audit[4928]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=21500 a0=3 a1=fffff21bc680 a2=0 a3=ffffbab58fa8 items=0 ppid=4162 pid=4928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.014000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 00:23:43.023406 containerd[1703]: time="2026-01-15T00:23:43.023355251Z" 
level=info msg="connecting to shim d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422" address="unix:///run/containerd/s/d8e0e55670d8b83a63f9de3a4d601f81354d7aca30bf2163d9d82ddc1b86443d" namespace=k8s.io protocol=ttrpc version=3 Jan 15 00:23:43.050441 systemd[1]: Started cri-containerd-d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422.scope - libcontainer container d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422. Jan 15 00:23:43.061000 audit: BPF prog-id=241 op=LOAD Jan 15 00:23:43.062000 audit: BPF prog-id=242 op=LOAD Jan 15 00:23:43.062000 audit[4949]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=4938 pid=4949 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.062000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439346535316361333039366263393766633062356430646366633366 Jan 15 00:23:43.062000 audit: BPF prog-id=242 op=UNLOAD Jan 15 00:23:43.062000 audit[4949]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4938 pid=4949 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.062000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439346535316361333039366263393766633062356430646366633366 Jan 15 00:23:43.062000 audit: BPF prog-id=243 op=LOAD Jan 15 00:23:43.062000 audit[4949]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=4938 pid=4949 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.062000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439346535316361333039366263393766633062356430646366633366 Jan 15 00:23:43.062000 audit: BPF prog-id=244 op=LOAD Jan 15 00:23:43.062000 audit[4949]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=4938 pid=4949 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.062000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439346535316361333039366263393766633062356430646366633366 Jan 15 00:23:43.062000 audit: BPF prog-id=244 op=UNLOAD Jan 15 00:23:43.062000 audit[4949]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4938 pid=4949 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.062000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439346535316361333039366263393766633062356430646366633366 Jan 15 00:23:43.062000 audit: BPF prog-id=243 op=UNLOAD Jan 15 00:23:43.062000 audit[4949]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4938 pid=4949 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.062000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439346535316361333039366263393766633062356430646366633366 Jan 15 00:23:43.062000 audit: BPF prog-id=245 op=LOAD Jan 15 00:23:43.062000 audit[4949]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=4938 pid=4949 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.062000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439346535316361333039366263393766633062356430646366633366 Jan 15 00:23:43.069304 kubelet[2931]: E0115 00:23:43.068413 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:23:43.070573 kubelet[2931]: E0115 00:23:43.070526 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:43.108000 audit[4976]: NETFILTER_CFG table=filter:140 family=2 entries=17 op=nft_register_rule pid=4976 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:43.108000 audit[4976]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 
a1=ffffece46760 a2=0 a3=1 items=0 ppid=3036 pid=4976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.108000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:43.110562 containerd[1703]: time="2026-01-15T00:23:43.110033836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ksjks,Uid:cacf3b5e-3a3d-4bb1-94ee-791570a7ddfd,Namespace:kube-system,Attempt:0,} returns sandbox id \"d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422\"" Jan 15 00:23:43.114030 containerd[1703]: time="2026-01-15T00:23:43.113994488Z" level=info msg="CreateContainer within sandbox \"d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 15 00:23:43.113000 audit[4976]: NETFILTER_CFG table=nat:141 family=2 entries=35 op=nft_register_chain pid=4976 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:43.113000 audit[4976]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=ffffece46760 a2=0 a3=1 items=0 ppid=3036 pid=4976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.113000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:43.132108 containerd[1703]: time="2026-01-15T00:23:43.131706222Z" level=info msg="Container 7906d588f8527998ccd1237ec7dbe84cd824125d3e4cc789dc60cae3a2f4ec84: CDI devices from CRI Config.CDIDevices: []" Jan 15 00:23:43.141664 containerd[1703]: time="2026-01-15T00:23:43.141602053Z" level=info msg="CreateContainer within sandbox \"d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7906d588f8527998ccd1237ec7dbe84cd824125d3e4cc789dc60cae3a2f4ec84\"" Jan 15 00:23:43.142138 containerd[1703]: time="2026-01-15T00:23:43.142100934Z" level=info msg="StartContainer for \"7906d588f8527998ccd1237ec7dbe84cd824125d3e4cc789dc60cae3a2f4ec84\"" Jan 15 00:23:43.144594 containerd[1703]: time="2026-01-15T00:23:43.144569102Z" level=info msg="connecting to shim 7906d588f8527998ccd1237ec7dbe84cd824125d3e4cc789dc60cae3a2f4ec84" address="unix:///run/containerd/s/d8e0e55670d8b83a63f9de3a4d601f81354d7aca30bf2163d9d82ddc1b86443d" protocol=ttrpc version=3 Jan 15 00:23:43.165450 systemd[1]: Started cri-containerd-7906d588f8527998ccd1237ec7dbe84cd824125d3e4cc789dc60cae3a2f4ec84.scope - libcontainer container 7906d588f8527998ccd1237ec7dbe84cd824125d3e4cc789dc60cae3a2f4ec84. 
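
In the WorkloadEndpoint dumps above, the Go struct printer shows port numbers as hex literals (Port:0x35, Port:0x23c1). Converting them confirms these are the expected CoreDNS ports:

```python
# The WorkloadEndpointPort entries above print ports in hex: 0x35 and 0x23c1.
for value, label in [(0x35, "dns / dns-tcp"), (0x23C1, "metrics")]:
    print(f"{label}: {value}")
# dns / dns-tcp: 53
# metrics: 9153
```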
Jan 15 00:23:43.174000 audit: BPF prog-id=246 op=LOAD Jan 15 00:23:43.174000 audit: BPF prog-id=247 op=LOAD Jan 15 00:23:43.174000 audit[4977]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=4938 pid=4977 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.174000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739303664353838663835323739393863636431323337656337646265 Jan 15 00:23:43.175000 audit: BPF prog-id=247 op=UNLOAD Jan 15 00:23:43.175000 audit[4977]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4938 pid=4977 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.175000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739303664353838663835323739393863636431323337656337646265 Jan 15 00:23:43.175000 audit: BPF prog-id=248 op=LOAD Jan 15 00:23:43.175000 audit[4977]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=4938 pid=4977 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.175000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739303664353838663835323739393863636431323337656337646265 Jan 15 00:23:43.175000 audit: BPF prog-id=249 op=LOAD Jan 15 00:23:43.175000 audit[4977]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=4938 pid=4977 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.175000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739303664353838663835323739393863636431323337656337646265 Jan 15 00:23:43.176000 audit: BPF prog-id=249 op=UNLOAD Jan 15 00:23:43.176000 audit[4977]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4938 pid=4977 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.176000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739303664353838663835323739393863636431323337656337646265 Jan 15 00:23:43.176000 audit: BPF prog-id=248 op=UNLOAD Jan 15 00:23:43.176000 audit[4977]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4938 pid=4977 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.176000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739303664353838663835323739393863636431323337656337646265 Jan 15 00:23:43.176000 audit: BPF prog-id=250 op=LOAD Jan 15 00:23:43.176000 audit[4977]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=4938 pid=4977 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:43.176000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739303664353838663835323739393863636431323337656337646265 Jan 15 00:23:43.193608 containerd[1703]: time="2026-01-15T00:23:43.193237971Z" level=info msg="StartContainer for \"7906d588f8527998ccd1237ec7dbe84cd824125d3e4cc789dc60cae3a2f4ec84\" returns successfully" Jan 15 00:23:43.872423 containerd[1703]: time="2026-01-15T00:23:43.872326888Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7877d5fb5-c885v,Uid:051b417e-bac4-4f72-8b07-3775d126567f,Namespace:calico-system,Attempt:0,}" Jan 15 00:23:43.889342 systemd-networkd[1612]: cali9eb1b5d670c: Gained IPv6LL Jan 15 00:23:43.978013 systemd-networkd[1612]: calibdbbd1a050a: Link UP Jan 15 00:23:43.978757 systemd-networkd[1612]: calibdbbd1a050a: Gained carrier Jan 15 00:23:43.994275 sshd[4793]: Connection closed by authenticating user root 119.84.148.253 port 58460 [preauth] Jan 15 00:23:43.993000 audit[4793]: USER_ERR pid=4793 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:23:43.997494 systemd[1]: sshd@9-10.0.3.29:22-119.84.148.253:58460.service: Deactivated successfully. 
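In the runc audit records above, arch=c00000b7 is the audit identifier for AArch64, and the syscall numbers follow the asm-generic table: 280 is bpf(2), 57 is close(2), and 211 (in the iptables-restore records) is sendmsg(2), i.e. the netlink traffic behind the nft_register_chain events. The paired BPF prog-id LOAD/UNLOAD events are most likely runc loading and releasing small BPF programs (such as the cgroup v2 device filter) while it starts the container; that reading is an inference from context, not something the log states. A rough sketch for pulling the interesting fields out of a SYSCALL record; the tiny syscall table is hand-maintained and only covers the numbers seen here:

    import re

    # AArch64 (asm-generic) syscall numbers observed in these audit records.
    AARCH64_SYSCALLS = {57: "close", 211: "sendmsg", 280: "bpf"}

    SYSCALL_RE = re.compile(
        r'syscall=(?P<nr>\d+) success=(?P<ok>\w+) exit=(?P<exit>-?\d+)'
        r'.*\bpid=(?P<pid>\d+).*comm="(?P<comm>[^"]+)"'
    )

    def summarize(record: str) -> str:
        m = SYSCALL_RE.search(record)
        if not m:
            return "not a SYSCALL record"
        name = AARCH64_SYSCALLS.get(int(m["nr"]), f"syscall {m['nr']}")
        return f"{m['comm']} (pid {m['pid']}): {name} -> exit={m['exit']}"

    sample = ('audit[4977]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 '
              'a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=4938 pid=4977 '
              'comm="runc" exe="/usr/bin/runc"')
    print(summarize(sample))   # runc (pid 4977): bpf -> exit=21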
Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.909 [INFO][5015] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--1ddc109f0f-k8s-calico--kube--controllers--7877d5fb5--c885v-eth0 calico-kube-controllers-7877d5fb5- calico-system 051b417e-bac4-4f72-8b07-3775d126567f 872 0 2026-01-15 00:23:02 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7877d5fb5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4515-1-0-n-1ddc109f0f calico-kube-controllers-7877d5fb5-c885v eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calibdbbd1a050a [] [] }} ContainerID="c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" Namespace="calico-system" Pod="calico-kube-controllers-7877d5fb5-c885v" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--kube--controllers--7877d5fb5--c885v-" Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.909 [INFO][5015] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" Namespace="calico-system" Pod="calico-kube-controllers-7877d5fb5-c885v" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--kube--controllers--7877d5fb5--c885v-eth0" Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.932 [INFO][5031] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" HandleID="k8s-pod-network.c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-calico--kube--controllers--7877d5fb5--c885v-eth0" Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.932 [INFO][5031] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" HandleID="k8s-pod-network.c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-calico--kube--controllers--7877d5fb5--c885v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000323390), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-n-1ddc109f0f", "pod":"calico-kube-controllers-7877d5fb5-c885v", "timestamp":"2026-01-15 00:23:43.932243351 +0000 UTC"}, Hostname:"ci-4515-1-0-n-1ddc109f0f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.932 [INFO][5031] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.932 [INFO][5031] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.932 [INFO][5031] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-1ddc109f0f' Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.943 [INFO][5031] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.948 [INFO][5031] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.952 [INFO][5031] ipam/ipam.go 511: Trying affinity for 192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.954 [INFO][5031] ipam/ipam.go 158: Attempting to load block cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.956 [INFO][5031] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.106.0/26 host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.956 [INFO][5031] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.106.0/26 handle="k8s-pod-network.c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.958 [INFO][5031] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1 Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.964 [INFO][5031] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.106.0/26 handle="k8s-pod-network.c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.973 [INFO][5031] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.106.8/26] block=192.168.106.0/26 handle="k8s-pod-network.c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.973 [INFO][5031] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.106.8/26] handle="k8s-pod-network.c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" host="ci-4515-1-0-n-1ddc109f0f" Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.973 [INFO][5031] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
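The IPAM trace above is the normal Calico allocation path: the node ci-4515-1-0-n-1ddc109f0f already holds an affinity for the block 192.168.106.0/26, so the plugin takes the host-wide IPAM lock, claims one free address from that block (192.168.106.8), writes the block back, and releases the lock. On a live cluster the same information can be inspected with calicoctl ipam show (if calicoctl is installed); as a quick offline sanity check, the claimed address does sit inside the affine /26:

    import ipaddress

    block = ipaddress.ip_network("192.168.106.0/26")
    claimed = ipaddress.ip_address("192.168.106.8")

    print(claimed in block)     # True: the address belongs to the node's affine block
    print(block.num_addresses)  # 64: size of a Calico /26 allocation block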
Jan 15 00:23:43.997879 containerd[1703]: 2026-01-15 00:23:43.973 [INFO][5031] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.106.8/26] IPv6=[] ContainerID="c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" HandleID="k8s-pod-network.c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" Workload="ci--4515--1--0--n--1ddc109f0f-k8s-calico--kube--controllers--7877d5fb5--c885v-eth0" Jan 15 00:23:43.998378 containerd[1703]: 2026-01-15 00:23:43.975 [INFO][5015] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" Namespace="calico-system" Pod="calico-kube-controllers-7877d5fb5-c885v" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--kube--controllers--7877d5fb5--c885v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-calico--kube--controllers--7877d5fb5--c885v-eth0", GenerateName:"calico-kube-controllers-7877d5fb5-", Namespace:"calico-system", SelfLink:"", UID:"051b417e-bac4-4f72-8b07-3775d126567f", ResourceVersion:"872", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 23, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7877d5fb5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"", Pod:"calico-kube-controllers-7877d5fb5-c885v", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.106.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibdbbd1a050a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:43.998378 containerd[1703]: 2026-01-15 00:23:43.975 [INFO][5015] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.106.8/32] ContainerID="c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" Namespace="calico-system" Pod="calico-kube-controllers-7877d5fb5-c885v" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--kube--controllers--7877d5fb5--c885v-eth0" Jan 15 00:23:43.998378 containerd[1703]: 2026-01-15 00:23:43.975 [INFO][5015] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibdbbd1a050a ContainerID="c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" Namespace="calico-system" Pod="calico-kube-controllers-7877d5fb5-c885v" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--kube--controllers--7877d5fb5--c885v-eth0" Jan 15 00:23:43.998378 containerd[1703]: 2026-01-15 00:23:43.981 [INFO][5015] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" Namespace="calico-system" Pod="calico-kube-controllers-7877d5fb5-c885v" 
WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--kube--controllers--7877d5fb5--c885v-eth0" Jan 15 00:23:43.998378 containerd[1703]: 2026-01-15 00:23:43.981 [INFO][5015] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" Namespace="calico-system" Pod="calico-kube-controllers-7877d5fb5-c885v" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--kube--controllers--7877d5fb5--c885v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--1ddc109f0f-k8s-calico--kube--controllers--7877d5fb5--c885v-eth0", GenerateName:"calico-kube-controllers-7877d5fb5-", Namespace:"calico-system", SelfLink:"", UID:"051b417e-bac4-4f72-8b07-3775d126567f", ResourceVersion:"872", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 0, 23, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7877d5fb5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-1ddc109f0f", ContainerID:"c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1", Pod:"calico-kube-controllers-7877d5fb5-c885v", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.106.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibdbbd1a050a", MAC:"26:b3:f2:f4:5d:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 00:23:43.998378 containerd[1703]: 2026-01-15 00:23:43.992 [INFO][5015] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" Namespace="calico-system" Pod="calico-kube-controllers-7877d5fb5-c885v" WorkloadEndpoint="ci--4515--1--0--n--1ddc109f0f-k8s-calico--kube--controllers--7877d5fb5--c885v-eth0" Jan 15 00:23:43.997000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.3.29:22-119.84.148.253:58460 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:23:44.005000 audit[5052]: NETFILTER_CFG table=filter:142 family=2 entries=52 op=nft_register_chain pid=5052 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 00:23:44.005000 audit[5052]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=24296 a0=3 a1=ffffd8899430 a2=0 a3=ffff9fbbefa8 items=0 ppid=4162 pid=5052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:44.005000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 00:23:44.023910 containerd[1703]: time="2026-01-15T00:23:44.023776911Z" level=info msg="connecting to shim c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1" address="unix:///run/containerd/s/fd6da17e738ff1fbe36c80dda091f4747ccedf9ae49427fb73bc8136683d521d" namespace=k8s.io protocol=ttrpc version=3 Jan 15 00:23:44.056429 systemd[1]: Started cri-containerd-c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1.scope - libcontainer container c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1. Jan 15 00:23:44.066000 audit: BPF prog-id=251 op=LOAD Jan 15 00:23:44.067000 audit: BPF prog-id=252 op=LOAD Jan 15 00:23:44.067000 audit[5071]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=5061 pid=5071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:44.067000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331353630393638313234646635336436333066373736393039396536 Jan 15 00:23:44.067000 audit: BPF prog-id=252 op=UNLOAD Jan 15 00:23:44.067000 audit[5071]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5061 pid=5071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:44.067000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331353630393638313234646635336436333066373736393039396536 Jan 15 00:23:44.067000 audit: BPF prog-id=253 op=LOAD Jan 15 00:23:44.067000 audit[5071]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=5061 pid=5071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:44.067000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331353630393638313234646635336436333066373736393039396536 Jan 15 00:23:44.068000 audit: BPF prog-id=254 op=LOAD Jan 15 00:23:44.068000 audit[5071]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 
a3=0 items=0 ppid=5061 pid=5071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:44.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331353630393638313234646635336436333066373736393039396536 Jan 15 00:23:44.068000 audit: BPF prog-id=254 op=UNLOAD Jan 15 00:23:44.068000 audit[5071]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5061 pid=5071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:44.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331353630393638313234646635336436333066373736393039396536 Jan 15 00:23:44.068000 audit: BPF prog-id=253 op=UNLOAD Jan 15 00:23:44.068000 audit[5071]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5061 pid=5071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:44.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331353630393638313234646635336436333066373736393039396536 Jan 15 00:23:44.068000 audit: BPF prog-id=255 op=LOAD Jan 15 00:23:44.068000 audit[5071]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=5061 pid=5071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:44.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331353630393638313234646635336436333066373736393039396536 Jan 15 00:23:44.073892 kubelet[2931]: E0115 00:23:44.073848 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:23:44.088827 kubelet[2931]: I0115 00:23:44.088570 2931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-ksjks" podStartSLOduration=58.088550269 podStartE2EDuration="58.088550269s" podCreationTimestamp="2026-01-15 00:22:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-15 00:23:44.086986904 +0000 UTC m=+65.304570251" watchObservedRunningTime="2026-01-15 00:23:44.088550269 +0000 UTC m=+65.306133616" Jan 15 00:23:44.101191 containerd[1703]: time="2026-01-15T00:23:44.101134468Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7877d5fb5-c885v,Uid:051b417e-bac4-4f72-8b07-3775d126567f,Namespace:calico-system,Attempt:0,} returns sandbox id \"c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1\"" Jan 15 00:23:44.102685 containerd[1703]: time="2026-01-15T00:23:44.102654632Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 00:23:44.117000 audit[5099]: NETFILTER_CFG table=filter:143 family=2 entries=14 op=nft_register_rule pid=5099 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:44.117000 audit[5099]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffd09119b0 a2=0 a3=1 items=0 ppid=3036 pid=5099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:44.117000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:44.126000 audit[5099]: NETFILTER_CFG table=nat:144 family=2 entries=44 op=nft_register_rule pid=5099 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:44.126000 audit[5099]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=ffffd09119b0 a2=0 a3=1 items=0 ppid=3036 pid=5099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:44.126000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:44.401587 systemd-networkd[1612]: calif7b6c94f4d0: Gained IPv6LL Jan 15 00:23:44.453271 containerd[1703]: time="2026-01-15T00:23:44.452991344Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:44.455934 containerd[1703]: time="2026-01-15T00:23:44.455889033Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 00:23:44.456061 containerd[1703]: time="2026-01-15T00:23:44.456003073Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 15 00:23:44.456206 kubelet[2931]: E0115 00:23:44.456133 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 00:23:44.456351 kubelet[2931]: E0115 00:23:44.456228 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 00:23:44.456533 kubelet[2931]: E0115 00:23:44.456464 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-568wc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7877d5fb5-c885v_calico-system(051b417e-bac4-4f72-8b07-3775d126567f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:44.457680 kubelet[2931]: E0115 00:23:44.457649 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:23:45.075643 kubelet[2931]: E0115 
00:23:45.075578 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:23:45.148000 audit[5101]: NETFILTER_CFG table=filter:145 family=2 entries=14 op=nft_register_rule pid=5101 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:45.148000 audit[5101]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe1021210 a2=0 a3=1 items=0 ppid=3036 pid=5101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:45.148000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:45.164000 audit[5101]: NETFILTER_CFG table=nat:146 family=2 entries=56 op=nft_register_chain pid=5101 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:23:45.164000 audit[5101]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=ffffe1021210 a2=0 a3=1 items=0 ppid=3036 pid=5101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:23:45.164000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:23:45.937353 systemd-networkd[1612]: calibdbbd1a050a: Gained IPv6LL Jan 15 00:23:46.077656 kubelet[2931]: E0115 00:23:46.077569 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:23:47.352206 kernel: kauditd_printk_skb: 170 callbacks suppressed Jan 15 00:23:47.352364 kernel: audit: type=1130 audit(1768436627.349:760): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.3.29:22-119.84.148.253:32904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:23:47.349000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.3.29:22-119.84.148.253:32904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:23:47.349972 systemd[1]: Started sshd@10-10.0.3.29:22-119.84.148.253:32904.service - OpenSSH per-connection server daemon (119.84.148.253:32904). 
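Every Calico image pull in this log fails the same way: containerd gets a 404 from ghcr.io, kubelet reports ErrImagePull and then ImagePullBackOff for the v3.30.4 tag (kube-controllers and apiserver above, and the same pattern repeats below for whisker, whisker-backend, csi, node-driver-registrar, and goldmane). A 404 at the registry means the tag cannot be resolved at all, so back-off will not recover on its own: the tag either does not exist under ghcr.io/flatcar/calico or is not visible to anonymous pulls. A rough way to confirm that from any machine with outbound HTTPS, assuming ghcr.io honours the standard Docker Registry v2 anonymous-token flow for public repositories (a hypothetical check, not part of this log):

    import json
    import urllib.error
    import urllib.request

    def ghcr_manifest_status(repo: str, tag: str) -> int:
        """HTTP status ghcr.io returns for repo:tag (404 means the tag cannot be resolved)."""
        # Anonymous pull token for a public repository -- assumes the standard
        # registry v2 token endpoint; adjust if the repository requires login.
        token_url = f"https://ghcr.io/token?scope=repository:{repo}:pull"
        with urllib.request.urlopen(token_url) as resp:
            token = json.load(resp)["token"]
        req = urllib.request.Request(
            f"https://ghcr.io/v2/{repo}/manifests/{tag}",
            headers={
                "Authorization": f"Bearer {token}",
                "Accept": "application/vnd.oci.image.index.v1+json",
            },
            method="HEAD",
        )
        try:
            with urllib.request.urlopen(req) as resp:
                return resp.status
        except urllib.error.HTTPError as err:
            return err.code

    print(ghcr_manifest_status("flatcar/calico/kube-controllers", "v3.30.4"))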
Jan 15 00:23:52.874390 containerd[1703]: time="2026-01-15T00:23:52.874325262Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 00:23:53.197907 containerd[1703]: time="2026-01-15T00:23:53.197738131Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:53.199036 containerd[1703]: time="2026-01-15T00:23:53.198998655Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 00:23:53.199119 containerd[1703]: time="2026-01-15T00:23:53.199084495Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 00:23:53.199327 kubelet[2931]: E0115 00:23:53.199288 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:23:53.199601 kubelet[2931]: E0115 00:23:53.199340 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:23:53.199601 kubelet[2931]: E0115 00:23:53.199555 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5q7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-d7bdbd78b-v4vh7_calico-apiserver(782417a5-ecd0-40c5-85c0-45ead5d347fd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:53.200329 containerd[1703]: time="2026-01-15T00:23:53.200209579Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 00:23:53.201312 kubelet[2931]: E0115 00:23:53.201263 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:23:53.711972 containerd[1703]: time="2026-01-15T00:23:53.711858144Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:53.713465 containerd[1703]: time="2026-01-15T00:23:53.713411628Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 00:23:53.713501 containerd[1703]: time="2026-01-15T00:23:53.713443109Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 00:23:53.713706 kubelet[2931]: E0115 00:23:53.713668 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 00:23:53.713757 kubelet[2931]: E0115 00:23:53.713715 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 00:23:53.713873 kubelet[2931]: E0115 00:23:53.713818 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:956fb8f8794a4941a07f54797f4e6220,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-flcf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-955f9fbff-rtvzr_calico-system(14a4f6b6-e857-4a63-b075-14b068610222): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:53.715992 containerd[1703]: time="2026-01-15T00:23:53.715954956Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 00:23:54.044983 containerd[1703]: time="2026-01-15T00:23:54.044868202Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:54.046356 containerd[1703]: time="2026-01-15T00:23:54.046306607Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 00:23:54.046447 containerd[1703]: time="2026-01-15T00:23:54.046391647Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 00:23:54.046608 kubelet[2931]: E0115 00:23:54.046544 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 00:23:54.046608 kubelet[2931]: E0115 00:23:54.046601 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 00:23:54.046990 kubelet[2931]: E0115 00:23:54.046816 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-flcf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-955f9fbff-rtvzr_calico-system(14a4f6b6-e857-4a63-b075-14b068610222): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:54.047100 containerd[1703]: time="2026-01-15T00:23:54.047000689Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 00:23:54.048243 kubelet[2931]: E0115 00:23:54.048209 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:23:54.389662 containerd[1703]: time="2026-01-15T00:23:54.389403976Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:54.390744 containerd[1703]: time="2026-01-15T00:23:54.390637780Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 00:23:54.390744 
containerd[1703]: time="2026-01-15T00:23:54.390673900Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 00:23:54.390910 kubelet[2931]: E0115 00:23:54.390862 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 00:23:54.391225 kubelet[2931]: E0115 00:23:54.390921 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 00:23:54.391225 kubelet[2931]: E0115 00:23:54.391033 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cx8sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-92nsn_calico-system(e8af8aef-db47-4bb0-9303-531f44a2593e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:54.393283 containerd[1703]: time="2026-01-15T00:23:54.392982307Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 00:23:54.719212 containerd[1703]: time="2026-01-15T00:23:54.718975104Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:54.720927 containerd[1703]: time="2026-01-15T00:23:54.720867950Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 00:23:54.721018 containerd[1703]: time="2026-01-15T00:23:54.720911830Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 00:23:54.721430 kubelet[2931]: E0115 00:23:54.721159 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 00:23:54.721430 kubelet[2931]: E0115 00:23:54.721272 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 00:23:54.721430 kubelet[2931]: E0115 00:23:54.721388 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cx8sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-92nsn_calico-system(e8af8aef-db47-4bb0-9303-531f44a2593e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:54.722740 kubelet[2931]: E0115 00:23:54.722689 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:23:55.873418 containerd[1703]: time="2026-01-15T00:23:55.873232715Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 00:23:56.204872 containerd[1703]: time="2026-01-15T00:23:56.204598288Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:56.206276 containerd[1703]: time="2026-01-15T00:23:56.206240213Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 00:23:56.206373 containerd[1703]: time="2026-01-15T00:23:56.206256813Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 00:23:56.206635 kubelet[2931]: E0115 00:23:56.206582 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:23:56.207104 kubelet[2931]: E0115 00:23:56.207079 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:23:56.207358 kubelet[2931]: E0115 00:23:56.207315 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xck2k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-d7bdbd78b-nn9k9_calico-apiserver(933e7fe5-e25e-48cf-938a-716b1fa3d838): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:56.208742 kubelet[2931]: E0115 00:23:56.208683 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:23:56.873348 containerd[1703]: time="2026-01-15T00:23:56.873298054Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 00:23:57.236808 sshd[5106]: Connection closed by authenticating user root 119.84.148.253 port 32904 [preauth] Jan 15 00:23:57.235000 audit[5106]: USER_ERR pid=5106 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:23:57.244312 kernel: audit: type=1109 audit(1768436637.235:761): pid=5106 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" 
exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:23:57.244392 kernel: audit: type=1131 audit(1768436637.241:762): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.3.29:22-119.84.148.253:32904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:23:57.241000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.3.29:22-119.84.148.253:32904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:23:57.242716 systemd[1]: sshd@10-10.0.3.29:22-119.84.148.253:32904.service: Deactivated successfully. Jan 15 00:23:57.294949 containerd[1703]: time="2026-01-15T00:23:57.294846743Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:57.297185 containerd[1703]: time="2026-01-15T00:23:57.297139430Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 00:23:57.297445 containerd[1703]: time="2026-01-15T00:23:57.297280590Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 00:23:57.297607 kubelet[2931]: E0115 00:23:57.297560 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 00:23:57.298028 kubelet[2931]: E0115 00:23:57.297726 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 00:23:57.298627 kubelet[2931]: E0115 00:23:57.298230 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqmw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-cd46w_calico-system(576324d0-4c45-424a-9f35-c0de23b9b1ac): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:57.299987 kubelet[2931]: E0115 00:23:57.299672 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:23:57.873088 containerd[1703]: time="2026-01-15T00:23:57.873025871Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 00:23:58.201444 containerd[1703]: time="2026-01-15T00:23:58.201253875Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:23:58.202803 containerd[1703]: time="2026-01-15T00:23:58.202757040Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 00:23:58.202892 containerd[1703]: time="2026-01-15T00:23:58.202851120Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 15 00:23:58.203031 kubelet[2931]: E0115 00:23:58.202991 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 00:23:58.203157 kubelet[2931]: E0115 00:23:58.203042 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 00:23:58.203286 kubelet[2931]: E0115 00:23:58.203163 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-568wc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7877d5fb5-c885v_calico-system(051b417e-bac4-4f72-8b07-3775d126567f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 00:23:58.204371 kubelet[2931]: E0115 00:23:58.204333 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:23:58.489528 systemd[1]: Started sshd@11-10.0.3.29:22-119.84.148.253:43908.service - OpenSSH per-connection server daemon (119.84.148.253:43908). Jan 15 00:23:58.488000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.3.29:22-119.84.148.253:43908 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:23:58.494203 kernel: audit: type=1130 audit(1768436638.488:763): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.3.29:22-119.84.148.253:43908 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:00.989260 sshd[5123]: Connection closed by authenticating user root 119.84.148.253 port 43908 [preauth] Jan 15 00:24:00.988000 audit[5123]: USER_ERR pid=5123 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:00.991762 systemd[1]: sshd@11-10.0.3.29:22-119.84.148.253:43908.service: Deactivated successfully. Jan 15 00:24:00.990000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.3.29:22-119.84.148.253:43908 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:00.995901 kernel: audit: type=1109 audit(1768436640.988:764): pid=5123 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" 
exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:00.996020 kernel: audit: type=1131 audit(1768436640.990:765): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.3.29:22-119.84.148.253:43908 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:02.255153 systemd[1]: Started sshd@12-10.0.3.29:22-119.84.148.253:47862.service - OpenSSH per-connection server daemon (119.84.148.253:47862). Jan 15 00:24:02.254000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.3.29:22-119.84.148.253:47862 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:02.261496 kernel: audit: type=1130 audit(1768436642.254:766): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.3.29:22-119.84.148.253:47862 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:03.874045 kubelet[2931]: E0115 00:24:03.873965 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:24:05.064301 sshd[5135]: Connection closed by authenticating user root 119.84.148.253 port 47862 [preauth] Jan 15 00:24:05.063000 audit[5135]: USER_ERR pid=5135 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:05.067200 systemd[1]: sshd@12-10.0.3.29:22-119.84.148.253:47862.service: Deactivated successfully. Jan 15 00:24:05.066000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.3.29:22-119.84.148.253:47862 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:05.070947 kernel: audit: type=1109 audit(1768436645.063:767): pid=5135 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:05.071049 kernel: audit: type=1131 audit(1768436645.066:768): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.3.29:22-119.84.148.253:47862 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:05.317445 systemd[1]: Started sshd@13-10.0.3.29:22-119.84.148.253:51328.service - OpenSSH per-connection server daemon (119.84.148.253:51328). Jan 15 00:24:05.316000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.3.29:22-119.84.148.253:51328 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:24:05.321205 kernel: audit: type=1130 audit(1768436645.316:769): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.3.29:22-119.84.148.253:51328 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:05.873184 kubelet[2931]: E0115 00:24:05.873025 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:24:06.307228 sshd[5143]: Connection closed by authenticating user root 119.84.148.253 port 51328 [preauth] Jan 15 00:24:06.306000 audit[5143]: USER_ERR pid=5143 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:06.310024 systemd[1]: sshd@13-10.0.3.29:22-119.84.148.253:51328.service: Deactivated successfully. Jan 15 00:24:06.309000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.3.29:22-119.84.148.253:51328 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:06.314225 kernel: audit: type=1109 audit(1768436646.306:770): pid=5143 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:06.314302 kernel: audit: type=1131 audit(1768436646.309:771): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.3.29:22-119.84.148.253:51328 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:06.551024 systemd[1]: Started sshd@14-10.0.3.29:22-119.84.148.253:52442.service - OpenSSH per-connection server daemon (119.84.148.253:52442). Jan 15 00:24:06.550000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.3.29:22-119.84.148.253:52442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:06.554220 kernel: audit: type=1130 audit(1768436646.550:772): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.3.29:22-119.84.148.253:52442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:24:07.872755 kubelet[2931]: E0115 00:24:07.872695 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:24:07.873899 kubelet[2931]: E0115 00:24:07.873648 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:24:10.873667 kubelet[2931]: E0115 00:24:10.873617 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:24:10.874227 kubelet[2931]: E0115 00:24:10.873644 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:24:14.875431 containerd[1703]: time="2026-01-15T00:24:14.875385276Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 00:24:15.225149 containerd[1703]: time="2026-01-15T00:24:15.224323744Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:24:15.225677 containerd[1703]: time="2026-01-15T00:24:15.225629708Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 00:24:15.225781 containerd[1703]: time="2026-01-15T00:24:15.225716428Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active 
requests=0, bytes read=0" Jan 15 00:24:15.226053 kubelet[2931]: E0115 00:24:15.226013 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:24:15.226383 kubelet[2931]: E0115 00:24:15.226064 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:24:15.226383 kubelet[2931]: E0115 00:24:15.226271 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5q7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-d7bdbd78b-v4vh7_calico-apiserver(782417a5-ecd0-40c5-85c0-45ead5d347fd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 00:24:15.227561 kubelet[2931]: E0115 00:24:15.227523 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:24:19.874048 containerd[1703]: time="2026-01-15T00:24:19.873232923Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 00:24:20.205942 containerd[1703]: time="2026-01-15T00:24:20.205813741Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:24:20.207464 containerd[1703]: time="2026-01-15T00:24:20.207424425Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 00:24:20.207583 containerd[1703]: time="2026-01-15T00:24:20.207511626Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 00:24:20.207754 kubelet[2931]: E0115 00:24:20.207703 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 00:24:20.209295 kubelet[2931]: E0115 00:24:20.207760 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 00:24:20.209295 kubelet[2931]: E0115 00:24:20.207887 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:956fb8f8794a4941a07f54797f4e6220,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-flcf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-955f9fbff-rtvzr_calico-system(14a4f6b6-e857-4a63-b075-14b068610222): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 00:24:20.210620 containerd[1703]: time="2026-01-15T00:24:20.210582835Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 00:24:20.535256 containerd[1703]: time="2026-01-15T00:24:20.535148548Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:24:20.536732 containerd[1703]: time="2026-01-15T00:24:20.536655112Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 00:24:20.536879 containerd[1703]: time="2026-01-15T00:24:20.536738633Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 00:24:20.537004 kubelet[2931]: E0115 00:24:20.536967 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 00:24:20.537282 kubelet[2931]: E0115 00:24:20.537082 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 00:24:20.537282 kubelet[2931]: E0115 00:24:20.537231 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-flcf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-955f9fbff-rtvzr_calico-system(14a4f6b6-e857-4a63-b075-14b068610222): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 00:24:20.538436 kubelet[2931]: E0115 00:24:20.538374 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:24:21.872989 containerd[1703]: time="2026-01-15T00:24:21.872939720Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 00:24:22.202774 containerd[1703]: time="2026-01-15T00:24:22.202630728Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:24:22.204266 containerd[1703]: time="2026-01-15T00:24:22.204227533Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 00:24:22.204395 containerd[1703]: time="2026-01-15T00:24:22.204282773Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 15 00:24:22.204736 kubelet[2931]: E0115 00:24:22.204526 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 00:24:22.204736 kubelet[2931]: E0115 00:24:22.204575 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 00:24:22.205047 kubelet[2931]: E0115 00:24:22.204844 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-568wc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7877d5fb5-c885v_calico-system(051b417e-bac4-4f72-8b07-3775d126567f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 00:24:22.205559 containerd[1703]: time="2026-01-15T00:24:22.205340976Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 00:24:22.206823 kubelet[2931]: E0115 00:24:22.206776 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:24:22.728229 containerd[1703]: time="2026-01-15T00:24:22.728035215Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:24:22.729437 containerd[1703]: time="2026-01-15T00:24:22.729381419Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 00:24:22.729508 containerd[1703]: time="2026-01-15T00:24:22.729469060Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 00:24:22.729667 kubelet[2931]: E0115 00:24:22.729626 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:24:22.729729 kubelet[2931]: E0115 00:24:22.729679 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:24:22.729863 kubelet[2931]: E0115 00:24:22.729810 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xck2k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-d7bdbd78b-nn9k9_calico-apiserver(933e7fe5-e25e-48cf-938a-716b1fa3d838): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 00:24:22.731187 kubelet[2931]: E0115 00:24:22.731108 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:24:22.874778 containerd[1703]: time="2026-01-15T00:24:22.874734104Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 00:24:23.205843 containerd[1703]: time="2026-01-15T00:24:23.205796037Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:24:23.207868 containerd[1703]: time="2026-01-15T00:24:23.207825203Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 00:24:23.207954 containerd[1703]: time="2026-01-15T00:24:23.207898963Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 00:24:23.209407 kubelet[2931]: E0115 00:24:23.209323 2931 
log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 00:24:23.209407 kubelet[2931]: E0115 00:24:23.209403 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 00:24:23.209714 kubelet[2931]: E0115 00:24:23.209518 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cx8sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-92nsn_calico-system(e8af8aef-db47-4bb0-9303-531f44a2593e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 00:24:23.211381 containerd[1703]: time="2026-01-15T00:24:23.211352734Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 00:24:23.559458 containerd[1703]: time="2026-01-15T00:24:23.559367398Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:24:23.560855 containerd[1703]: time="2026-01-15T00:24:23.560810922Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 00:24:23.560984 containerd[1703]: time="2026-01-15T00:24:23.560896523Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 00:24:23.561068 kubelet[2931]: E0115 00:24:23.561024 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 00:24:23.561112 kubelet[2931]: E0115 00:24:23.561079 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 00:24:23.562165 kubelet[2931]: E0115 00:24:23.562100 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cx8sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-92nsn_calico-system(e8af8aef-db47-4bb0-9303-531f44a2593e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 00:24:23.563296 kubelet[2931]: E0115 00:24:23.563248 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with 
ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:24:23.874279 containerd[1703]: time="2026-01-15T00:24:23.873742400Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 00:24:24.203424 containerd[1703]: time="2026-01-15T00:24:24.203223607Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:24:24.204723 containerd[1703]: time="2026-01-15T00:24:24.204659292Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 00:24:24.204825 containerd[1703]: time="2026-01-15T00:24:24.204757292Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 00:24:24.205010 kubelet[2931]: E0115 00:24:24.204972 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 00:24:24.205066 kubelet[2931]: E0115 00:24:24.205021 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 00:24:24.205244 kubelet[2931]: E0115 00:24:24.205161 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqmw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-cd46w_calico-system(576324d0-4c45-424a-9f35-c0de23b9b1ac): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 00:24:24.206483 kubelet[2931]: E0115 00:24:24.206434 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:24:26.874406 kubelet[2931]: E0115 00:24:26.874316 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:24:29.036069 sshd[5149]: Connection closed by authenticating user root 119.84.148.253 port 52442 [preauth] Jan 15 00:24:29.035000 audit[5149]: USER_ERR pid=5149 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:29.040324 systemd[1]: sshd@14-10.0.3.29:22-119.84.148.253:52442.service: Deactivated successfully. Jan 15 00:24:29.041200 kernel: audit: type=1109 audit(1768436669.035:773): pid=5149 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:29.040000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.3.29:22-119.84.148.253:52442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:29.045641 kernel: audit: type=1131 audit(1768436669.040:774): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.3.29:22-119.84.148.253:52442 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:29.311290 systemd[1]: Started sshd@15-10.0.3.29:22-119.84.148.253:44394.service - OpenSSH per-connection server daemon (119.84.148.253:44394). Jan 15 00:24:29.310000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.3.29:22-119.84.148.253:44394 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:29.316260 kernel: audit: type=1130 audit(1768436669.310:775): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.3.29:22-119.84.148.253:44394 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:31.155457 sshd[5200]: Connection closed by authenticating user root 119.84.148.253 port 44394 [preauth] Jan 15 00:24:31.155000 audit[5200]: USER_ERR pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:31.158966 systemd[1]: sshd@15-10.0.3.29:22-119.84.148.253:44394.service: Deactivated successfully. Jan 15 00:24:31.159000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.3.29:22-119.84.148.253:44394 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:31.164298 kernel: audit: type=1109 audit(1768436671.155:776): pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" 
exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:31.165119 kernel: audit: type=1131 audit(1768436671.159:777): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.3.29:22-119.84.148.253:44394 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:31.963105 systemd[1]: Started sshd@16-10.0.3.29:22-119.84.148.253:46444.service - OpenSSH per-connection server daemon (119.84.148.253:46444). Jan 15 00:24:31.962000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.3.29:22-119.84.148.253:46444 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:31.967199 kernel: audit: type=1130 audit(1768436671.962:778): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.3.29:22-119.84.148.253:46444 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:33.877301 kubelet[2931]: E0115 00:24:33.877230 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:24:34.875648 kubelet[2931]: E0115 00:24:34.875420 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:24:34.876152 kubelet[2931]: E0115 00:24:34.875766 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:24:35.135316 sshd[5206]: Connection closed by authenticating user root 119.84.148.253 port 46444 [preauth] Jan 15 00:24:35.134000 audit[5206]: USER_ERR pid=5206 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" 
exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:35.137833 systemd[1]: sshd@16-10.0.3.29:22-119.84.148.253:46444.service: Deactivated successfully. Jan 15 00:24:35.136000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.3.29:22-119.84.148.253:46444 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:35.141662 kernel: audit: type=1109 audit(1768436675.134:779): pid=5206 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:35.141727 kernel: audit: type=1131 audit(1768436675.136:780): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.3.29:22-119.84.148.253:46444 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:35.315443 systemd[1]: Started sshd@17-10.0.3.29:22-119.84.148.253:48932.service - OpenSSH per-connection server daemon (119.84.148.253:48932). Jan 15 00:24:35.314000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.3.29:22-119.84.148.253:48932 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:35.319488 kernel: audit: type=1130 audit(1768436675.314:781): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.3.29:22-119.84.148.253:48932 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:35.873124 kubelet[2931]: E0115 00:24:35.873077 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:24:35.874105 kubelet[2931]: E0115 00:24:35.874007 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:24:37.276124 sshd[5213]: Connection closed by authenticating user root 119.84.148.253 port 48932 [preauth] Jan 15 00:24:37.275000 audit[5213]: USER_ERR pid=5213 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:37.279202 systemd[1]: sshd@17-10.0.3.29:22-119.84.148.253:48932.service: Deactivated successfully. Jan 15 00:24:37.279000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.3.29:22-119.84.148.253:48932 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:37.282964 kernel: audit: type=1109 audit(1768436677.275:782): pid=5213 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:37.283068 kernel: audit: type=1131 audit(1768436677.279:783): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.3.29:22-119.84.148.253:48932 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:37.527000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.3.29:22-119.84.148.253:51852 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:37.528913 systemd[1]: Started sshd@18-10.0.3.29:22-119.84.148.253:51852.service - OpenSSH per-connection server daemon (119.84.148.253:51852). Jan 15 00:24:37.533216 kernel: audit: type=1130 audit(1768436677.527:784): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.3.29:22-119.84.148.253:51852 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:37.872741 kubelet[2931]: E0115 00:24:37.872591 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:24:41.289483 sshd[5219]: Connection closed by authenticating user root 119.84.148.253 port 51852 [preauth] Jan 15 00:24:41.288000 audit[5219]: USER_ERR pid=5219 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:41.293669 systemd[1]: sshd@18-10.0.3.29:22-119.84.148.253:51852.service: Deactivated successfully. Jan 15 00:24:41.294500 kernel: audit: type=1109 audit(1768436681.288:785): pid=5219 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:41.292000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.3.29:22-119.84.148.253:51852 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:24:41.298208 kernel: audit: type=1131 audit(1768436681.292:786): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.3.29:22-119.84.148.253:51852 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:41.527536 systemd[1]: Started sshd@19-10.0.3.29:22-119.84.148.253:55416.service - OpenSSH per-connection server daemon (119.84.148.253:55416). Jan 15 00:24:41.526000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.3.29:22-119.84.148.253:55416 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:41.531196 kernel: audit: type=1130 audit(1768436681.526:787): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.3.29:22-119.84.148.253:55416 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:43.312207 sshd[5252]: Connection closed by authenticating user root 119.84.148.253 port 55416 [preauth] Jan 15 00:24:43.311000 audit[5252]: USER_ERR pid=5252 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:43.315690 systemd[1]: sshd@19-10.0.3.29:22-119.84.148.253:55416.service: Deactivated successfully. Jan 15 00:24:43.317503 kernel: audit: type=1109 audit(1768436683.311:788): pid=5252 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:43.317581 kernel: audit: type=1131 audit(1768436683.316:789): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.3.29:22-119.84.148.253:55416 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:43.316000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.3.29:22-119.84.148.253:55416 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:43.572651 systemd[1]: Started sshd@20-10.0.3.29:22-119.84.148.253:57232.service - OpenSSH per-connection server daemon (119.84.148.253:57232). Jan 15 00:24:43.571000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.3.29:22-119.84.148.253:57232 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:43.576206 kernel: audit: type=1130 audit(1768436683.571:790): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.3.29:22-119.84.148.253:57232 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:24:44.873768 kubelet[2931]: E0115 00:24:44.873654 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:24:45.389267 sshd[5259]: Connection closed by authenticating user root 119.84.148.253 port 57232 [preauth] Jan 15 00:24:45.388000 audit[5259]: USER_ERR pid=5259 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:45.393635 systemd[1]: sshd@20-10.0.3.29:22-119.84.148.253:57232.service: Deactivated successfully. Jan 15 00:24:45.392000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.3.29:22-119.84.148.253:57232 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:45.396959 kernel: audit: type=1109 audit(1768436685.388:791): pid=5259 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:45.397109 kernel: audit: type=1131 audit(1768436685.392:792): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.3.29:22-119.84.148.253:57232 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:45.873891 kubelet[2931]: E0115 00:24:45.873781 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:24:46.640118 systemd[1]: Started sshd@21-10.0.3.29:22-119.84.148.253:59058.service - OpenSSH per-connection server daemon (119.84.148.253:59058). Jan 15 00:24:46.639000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.3.29:22-119.84.148.253:59058 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:24:46.645027 kernel: audit: type=1130 audit(1768436686.639:793): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.3.29:22-119.84.148.253:59058 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:46.873986 kubelet[2931]: E0115 00:24:46.873877 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:24:46.875488 kubelet[2931]: E0115 00:24:46.874043 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:24:50.873247 kubelet[2931]: E0115 00:24:50.872931 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:24:50.873608 kubelet[2931]: E0115 00:24:50.873495 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:24:51.999269 sshd[5267]: Connection closed by authenticating user root 119.84.148.253 port 59058 [preauth] Jan 15 00:24:51.998000 audit[5267]: USER_ERR pid=5267 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:52.001537 systemd[1]: sshd@21-10.0.3.29:22-119.84.148.253:59058.service: Deactivated successfully. 
Jan 15 00:24:52.001000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.3.29:22-119.84.148.253:59058 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:52.006339 kernel: audit: type=1109 audit(1768436691.998:794): pid=5267 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:52.006434 kernel: audit: type=1131 audit(1768436692.001:795): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.3.29:22-119.84.148.253:59058 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:52.095113 systemd[1]: Started sshd@22-10.0.3.29:22-119.84.148.253:35236.service - OpenSSH per-connection server daemon (119.84.148.253:35236). Jan 15 00:24:52.094000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.3.29:22-119.84.148.253:35236 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:52.099207 kernel: audit: type=1130 audit(1768436692.094:796): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.3.29:22-119.84.148.253:35236 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:53.099160 sshd[5273]: Connection closed by authenticating user root 119.84.148.253 port 35236 [preauth] Jan 15 00:24:53.098000 audit[5273]: USER_ERR pid=5273 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:53.101861 systemd[1]: sshd@22-10.0.3.29:22-119.84.148.253:35236.service: Deactivated successfully. Jan 15 00:24:53.101000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.3.29:22-119.84.148.253:35236 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:53.106020 kernel: audit: type=1109 audit(1768436693.098:797): pid=5273 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:53.106075 kernel: audit: type=1131 audit(1768436693.101:798): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.3.29:22-119.84.148.253:35236 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:53.364321 systemd[1]: Started sshd@23-10.0.3.29:22-119.84.148.253:37724.service - OpenSSH per-connection server daemon (119.84.148.253:37724). Jan 15 00:24:53.363000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.3.29:22-119.84.148.253:37724 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:24:53.368244 kernel: audit: type=1130 audit(1768436693.363:799): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.3.29:22-119.84.148.253:37724 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:54.382599 sshd[5279]: Connection closed by authenticating user root 119.84.148.253 port 37724 [preauth] Jan 15 00:24:54.381000 audit[5279]: USER_ERR pid=5279 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:54.385126 systemd[1]: sshd@23-10.0.3.29:22-119.84.148.253:37724.service: Deactivated successfully. Jan 15 00:24:54.384000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.3.29:22-119.84.148.253:37724 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:54.389095 kernel: audit: type=1109 audit(1768436694.381:800): pid=5279 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:54.389250 kernel: audit: type=1131 audit(1768436694.384:801): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.3.29:22-119.84.148.253:37724 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:54.641000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.3.29:22-119.84.148.253:38898 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:54.642473 systemd[1]: Started sshd@24-10.0.3.29:22-119.84.148.253:38898.service - OpenSSH per-connection server daemon (119.84.148.253:38898). Jan 15 00:24:54.650218 kernel: audit: type=1130 audit(1768436694.641:802): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.3.29:22-119.84.148.253:38898 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:56.873778 kubelet[2931]: E0115 00:24:56.873707 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:24:57.031283 sshd[5285]: Connection closed by authenticating user root 119.84.148.253 port 38898 [preauth] Jan 15 00:24:57.031000 audit[5285]: USER_ERR pid=5285 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:57.034511 systemd[1]: sshd@24-10.0.3.29:22-119.84.148.253:38898.service: Deactivated successfully. 
Jan 15 00:24:57.033000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.3.29:22-119.84.148.253:38898 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:57.038454 kernel: audit: type=1109 audit(1768436697.031:803): pid=5285 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:24:57.038565 kernel: audit: type=1131 audit(1768436697.033:804): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.3.29:22-119.84.148.253:38898 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:57.716000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.3.29:22-119.84.148.253:40728 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:57.718029 systemd[1]: Started sshd@25-10.0.3.29:22-119.84.148.253:40728.service - OpenSSH per-connection server daemon (119.84.148.253:40728). Jan 15 00:24:57.721196 kernel: audit: type=1130 audit(1768436697.716:805): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.3.29:22-119.84.148.253:40728 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:24:57.874657 kubelet[2931]: E0115 00:24:57.874537 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:24:58.875077 kubelet[2931]: E0115 00:24:58.875019 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:25:00.873541 kubelet[2931]: E0115 00:25:00.873452 2931 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:25:01.873894 containerd[1703]: time="2026-01-15T00:25:01.873815950Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 00:25:02.132499 sshd[5291]: Connection closed by authenticating user root 119.84.148.253 port 40728 [preauth] Jan 15 00:25:02.131000 audit[5291]: USER_ERR pid=5291 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:25:02.134936 systemd[1]: sshd@25-10.0.3.29:22-119.84.148.253:40728.service: Deactivated successfully. Jan 15 00:25:02.134000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.3.29:22-119.84.148.253:40728 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:25:02.139091 kernel: audit: type=1109 audit(1768436702.131:806): pid=5291 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:25:02.139159 kernel: audit: type=1131 audit(1768436702.134:807): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.3.29:22-119.84.148.253:40728 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:25:02.417313 containerd[1703]: time="2026-01-15T00:25:02.416648690Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:25:02.418568 containerd[1703]: time="2026-01-15T00:25:02.418516176Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 00:25:02.418673 containerd[1703]: time="2026-01-15T00:25:02.418533376Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 00:25:02.418782 kubelet[2931]: E0115 00:25:02.418743 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:25:02.419038 kubelet[2931]: E0115 00:25:02.418792 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:25:02.419038 kubelet[2931]: E0115 00:25:02.418909 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5q7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-d7bdbd78b-v4vh7_calico-apiserver(782417a5-ecd0-40c5-85c0-45ead5d347fd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 00:25:02.420540 kubelet[2931]: E0115 00:25:02.420414 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:25:03.387931 systemd[1]: Started sshd@26-10.0.3.29:22-119.84.148.253:45344.service - OpenSSH per-connection server daemon (119.84.148.253:45344). Jan 15 00:25:03.387000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.3.29:22-119.84.148.253:45344 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:25:03.394246 kernel: audit: type=1130 audit(1768436703.387:808): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.3.29:22-119.84.148.253:45344 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:25:03.873132 containerd[1703]: time="2026-01-15T00:25:03.873054225Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 00:25:05.325698 containerd[1703]: time="2026-01-15T00:25:05.325624868Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:25:05.327433 containerd[1703]: time="2026-01-15T00:25:05.327298353Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 00:25:05.327433 containerd[1703]: time="2026-01-15T00:25:05.327379433Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 00:25:05.327642 kubelet[2931]: E0115 00:25:05.327522 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:25:05.327642 kubelet[2931]: E0115 00:25:05.327570 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:25:05.327906 kubelet[2931]: E0115 00:25:05.327713 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xck2k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-d7bdbd78b-nn9k9_calico-apiserver(933e7fe5-e25e-48cf-938a-716b1fa3d838): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 00:25:05.329112 kubelet[2931]: E0115 00:25:05.329037 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:25:06.892715 sshd[5303]: Connection closed by authenticating user root 119.84.148.253 port 45344 [preauth] Jan 15 00:25:06.892000 audit[5303]: USER_ERR pid=5303 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:25:06.897371 systemd[1]: sshd@26-10.0.3.29:22-119.84.148.253:45344.service: Deactivated successfully. Jan 15 00:25:06.896000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.3.29:22-119.84.148.253:45344 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:25:06.901092 kernel: audit: type=1109 audit(1768436706.892:809): pid=5303 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:25:06.901168 kernel: audit: type=1131 audit(1768436706.896:810): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.3.29:22-119.84.148.253:45344 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:25:08.876290 containerd[1703]: time="2026-01-15T00:25:08.875335485Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 00:25:09.258774 containerd[1703]: time="2026-01-15T00:25:09.258578178Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:25:09.259860 containerd[1703]: time="2026-01-15T00:25:09.259820661Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 00:25:09.259929 containerd[1703]: time="2026-01-15T00:25:09.259900382Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 00:25:09.260231 kubelet[2931]: E0115 00:25:09.260054 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 00:25:09.260231 kubelet[2931]: E0115 00:25:09.260110 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 00:25:09.260506 kubelet[2931]: E0115 00:25:09.260219 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:956fb8f8794a4941a07f54797f4e6220,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-flcf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-955f9fbff-rtvzr_calico-system(14a4f6b6-e857-4a63-b075-14b068610222): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 00:25:09.262209 containerd[1703]: time="2026-01-15T00:25:09.262069508Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 00:25:09.583000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-10.0.3.29:22-119.84.148.253:49046 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:25:09.584823 systemd[1]: Started sshd@27-10.0.3.29:22-119.84.148.253:49046.service - OpenSSH per-connection server daemon (119.84.148.253:49046). Jan 15 00:25:09.589221 kernel: audit: type=1130 audit(1768436709.583:811): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-10.0.3.29:22-119.84.148.253:49046 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:25:09.592940 containerd[1703]: time="2026-01-15T00:25:09.592884200Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:25:09.594793 containerd[1703]: time="2026-01-15T00:25:09.594746166Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 00:25:09.594853 containerd[1703]: time="2026-01-15T00:25:09.594805526Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 00:25:09.595080 kubelet[2931]: E0115 00:25:09.595040 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 00:25:09.595149 kubelet[2931]: E0115 00:25:09.595091 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 00:25:09.595555 kubelet[2931]: E0115 00:25:09.595217 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-flcf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-955f9fbff-rtvzr_calico-system(14a4f6b6-e857-4a63-b075-14b068610222): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 00:25:09.596520 kubelet[2931]: E0115 00:25:09.596391 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:25:11.873438 containerd[1703]: time="2026-01-15T00:25:11.873344055Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 00:25:12.248993 containerd[1703]: time="2026-01-15T00:25:12.248870284Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:25:12.249962 containerd[1703]: time="2026-01-15T00:25:12.249897007Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 00:25:12.250024 
containerd[1703]: time="2026-01-15T00:25:12.249990567Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 00:25:12.250566 kubelet[2931]: E0115 00:25:12.250525 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 00:25:12.250872 kubelet[2931]: E0115 00:25:12.250579 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 00:25:12.250872 kubelet[2931]: E0115 00:25:12.250813 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cx8sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-92nsn_calico-system(e8af8aef-db47-4bb0-9303-531f44a2593e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 00:25:12.251438 containerd[1703]: time="2026-01-15T00:25:12.251194291Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 00:25:12.600197 containerd[1703]: time="2026-01-15T00:25:12.599317156Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:25:12.601804 containerd[1703]: time="2026-01-15T00:25:12.601735163Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" 
failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 00:25:12.601907 containerd[1703]: time="2026-01-15T00:25:12.601839924Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 15 00:25:12.602088 kubelet[2931]: E0115 00:25:12.602048 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 00:25:12.602225 kubelet[2931]: E0115 00:25:12.602207 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 00:25:12.602779 kubelet[2931]: E0115 00:25:12.602468 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-568wc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7877d5fb5-c885v_calico-system(051b417e-bac4-4f72-8b07-3775d126567f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 00:25:12.602914 containerd[1703]: time="2026-01-15T00:25:12.602567726Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 00:25:12.604188 kubelet[2931]: E0115 00:25:12.604102 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:25:12.652115 sshd[5337]: Connection closed by authenticating user root 119.84.148.253 port 49046 [preauth] Jan 15 00:25:12.651000 audit[5337]: USER_ERR pid=5337 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:25:12.654258 systemd[1]: sshd@27-10.0.3.29:22-119.84.148.253:49046.service: Deactivated successfully. Jan 15 00:25:12.653000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-10.0.3.29:22-119.84.148.253:49046 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:25:12.660035 kernel: audit: type=1109 audit(1768436712.651:812): pid=5337 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:25:12.660118 kernel: audit: type=1131 audit(1768436712.653:813): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-10.0.3.29:22-119.84.148.253:49046 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:25:12.949795 containerd[1703]: time="2026-01-15T00:25:12.949670668Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:25:12.951037 containerd[1703]: time="2026-01-15T00:25:12.950969952Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 00:25:12.951142 containerd[1703]: time="2026-01-15T00:25:12.951009232Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 00:25:12.951377 kubelet[2931]: E0115 00:25:12.951331 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 00:25:12.951436 kubelet[2931]: E0115 00:25:12.951386 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 00:25:12.951544 kubelet[2931]: E0115 00:25:12.951503 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cx8sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-92nsn_calico-system(e8af8aef-db47-4bb0-9303-531f44a2593e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 00:25:12.952739 kubelet[2931]: E0115 00:25:12.952689 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:25:13.323000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-10.0.3.29:22-119.84.148.253:54662 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:25:13.324954 systemd[1]: Started sshd@28-10.0.3.29:22-119.84.148.253:54662.service - OpenSSH per-connection server daemon (119.84.148.253:54662). Jan 15 00:25:13.329219 kernel: audit: type=1130 audit(1768436713.323:814): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-10.0.3.29:22-119.84.148.253:54662 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:25:13.872880 kubelet[2931]: E0115 00:25:13.872830 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:25:13.873285 containerd[1703]: time="2026-01-15T00:25:13.873014212Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 00:25:14.204483 containerd[1703]: time="2026-01-15T00:25:14.204329665Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:25:14.206035 containerd[1703]: time="2026-01-15T00:25:14.205941230Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 00:25:14.206035 containerd[1703]: time="2026-01-15T00:25:14.205982950Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 00:25:14.206793 kubelet[2931]: E0115 00:25:14.206145 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 00:25:14.207098 kubelet[2931]: E0115 00:25:14.206887 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 00:25:14.207098 kubelet[2931]: E0115 00:25:14.207030 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqmw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-cd46w_calico-system(576324d0-4c45-424a-9f35-c0de23b9b1ac): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 00:25:14.208563 kubelet[2931]: E0115 00:25:14.208523 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:25:16.873030 kubelet[2931]: E0115 00:25:16.872956 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:25:19.238609 sshd[5351]: Connection closed by authenticating user root 119.84.148.253 port 54662 [preauth] Jan 15 00:25:19.238000 audit[5351]: USER_ERR pid=5351 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:25:19.241534 systemd[1]: sshd@28-10.0.3.29:22-119.84.148.253:54662.service: Deactivated successfully. Jan 15 00:25:19.240000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-10.0.3.29:22-119.84.148.253:54662 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:25:19.246297 kernel: audit: type=1109 audit(1768436719.238:815): pid=5351 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:25:19.246391 kernel: audit: type=1131 audit(1768436719.240:816): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-10.0.3.29:22-119.84.148.253:54662 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:25:20.048318 systemd[1]: Started sshd@29-10.0.3.29:22-119.84.148.253:32872.service - OpenSSH per-connection server daemon (119.84.148.253:32872). Jan 15 00:25:20.047000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-10.0.3.29:22-119.84.148.253:32872 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:25:20.053227 kernel: audit: type=1130 audit(1768436720.047:817): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-10.0.3.29:22-119.84.148.253:32872 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:25:21.895308 sshd[5372]: Connection closed by authenticating user root 119.84.148.253 port 32872 [preauth] Jan 15 00:25:21.894000 audit[5372]: USER_ERR pid=5372 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:25:21.900444 systemd[1]: sshd@29-10.0.3.29:22-119.84.148.253:32872.service: Deactivated successfully. Jan 15 00:25:21.899000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-10.0.3.29:22-119.84.148.253:32872 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:25:21.904708 kernel: audit: type=1109 audit(1768436721.894:818): pid=5372 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" 
exe="/usr/lib64/misc/sshd-session" hostname=119.84.148.253 addr=119.84.148.253 terminal=ssh res=failed' Jan 15 00:25:21.904787 kernel: audit: type=1131 audit(1768436721.899:819): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-10.0.3.29:22-119.84.148.253:32872 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:25:24.875226 kubelet[2931]: E0115 00:25:24.875119 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:25:25.873491 kubelet[2931]: E0115 00:25:25.873438 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:25:26.873694 kubelet[2931]: E0115 00:25:26.872692 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:25:27.873264 kubelet[2931]: E0115 00:25:27.872978 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:25:27.874417 kubelet[2931]: E0115 00:25:27.874316 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not 
found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:25:29.872875 kubelet[2931]: E0115 00:25:29.872539 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:25:35.874936 kubelet[2931]: E0115 00:25:35.874859 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:25:36.876750 kubelet[2931]: E0115 00:25:36.876357 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:25:38.875022 kubelet[2931]: E0115 00:25:38.874962 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:25:39.873619 kubelet[2931]: E0115 00:25:39.873574 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:25:39.873848 kubelet[2931]: E0115 00:25:39.873801 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:25:41.873567 kubelet[2931]: E0115 00:25:41.873481 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:25:48.874073 kubelet[2931]: E0115 00:25:48.874019 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:25:48.875055 kubelet[2931]: E0115 00:25:48.874992 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:25:50.875223 kubelet[2931]: E0115 00:25:50.875148 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:25:50.875816 kubelet[2931]: E0115 00:25:50.875777 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:25:51.872762 kubelet[2931]: E0115 00:25:51.872718 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:25:54.872722 kubelet[2931]: E0115 00:25:54.872567 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:25:59.873397 kubelet[2931]: E0115 00:25:59.873313 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:26:02.873654 kubelet[2931]: E0115 00:26:02.873581 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: 
not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:26:02.874798 kubelet[2931]: E0115 00:26:02.874759 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:26:03.873231 kubelet[2931]: E0115 00:26:03.873133 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:26:05.873444 kubelet[2931]: E0115 00:26:05.873347 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:26:07.873637 kubelet[2931]: E0115 00:26:07.873571 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:26:13.875080 kubelet[2931]: E0115 00:26:13.875004 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack 
image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:26:15.874484 kubelet[2931]: E0115 00:26:15.874436 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:26:16.874828 kubelet[2931]: E0115 00:26:16.874660 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:26:17.873583 kubelet[2931]: E0115 00:26:17.873407 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:26:18.874382 kubelet[2931]: E0115 00:26:18.874317 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:26:18.875386 kubelet[2931]: E0115 00:26:18.875340 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:26:28.874940 kubelet[2931]: E0115 00:26:28.874802 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:26:28.875467 kubelet[2931]: E0115 00:26:28.874997 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:26:29.875394 containerd[1703]: time="2026-01-15T00:26:29.875103479Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 00:26:30.212656 containerd[1703]: time="2026-01-15T00:26:30.212465750Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:26:30.214042 containerd[1703]: time="2026-01-15T00:26:30.213994595Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 00:26:30.214150 containerd[1703]: time="2026-01-15T00:26:30.214016995Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 00:26:30.214377 kubelet[2931]: E0115 00:26:30.214333 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:26:30.214682 kubelet[2931]: E0115 00:26:30.214391 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:26:30.214682 kubelet[2931]: E0115 00:26:30.214510 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5q7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-d7bdbd78b-v4vh7_calico-apiserver(782417a5-ecd0-40c5-85c0-45ead5d347fd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 00:26:30.215878 kubelet[2931]: E0115 00:26:30.215821 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:26:31.873946 kubelet[2931]: E0115 00:26:31.873884 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:26:32.873869 containerd[1703]: time="2026-01-15T00:26:32.873833851Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 00:26:33.209355 containerd[1703]: time="2026-01-15T00:26:33.209203997Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:26:33.210455 containerd[1703]: time="2026-01-15T00:26:33.210416120Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 00:26:33.210572 containerd[1703]: time="2026-01-15T00:26:33.210491401Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 15 00:26:33.210689 kubelet[2931]: E0115 00:26:33.210635 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 00:26:33.210689 kubelet[2931]: E0115 00:26:33.210691 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 00:26:33.210994 kubelet[2931]: E0115 00:26:33.210810 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-568wc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7877d5fb5-c885v_calico-system(051b417e-bac4-4f72-8b07-3775d126567f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 00:26:33.212037 kubelet[2931]: E0115 00:26:33.211993 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:26:33.873311 containerd[1703]: time="2026-01-15T00:26:33.873259428Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 00:26:34.206647 containerd[1703]: time="2026-01-15T00:26:34.206521527Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:26:34.208100 containerd[1703]: time="2026-01-15T00:26:34.208044332Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 00:26:34.208204 containerd[1703]: time="2026-01-15T00:26:34.208045452Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 00:26:34.208397 kubelet[2931]: E0115 00:26:34.208299 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:26:34.208397 kubelet[2931]: E0115 00:26:34.208350 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:26:34.208543 
kubelet[2931]: E0115 00:26:34.208494 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xck2k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-d7bdbd78b-nn9k9_calico-apiserver(933e7fe5-e25e-48cf-938a-716b1fa3d838): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 00:26:34.210008 kubelet[2931]: E0115 00:26:34.209653 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:26:40.873762 containerd[1703]: time="2026-01-15T00:26:40.873523359Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 00:26:41.214711 containerd[1703]: time="2026-01-15T00:26:41.214583323Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:26:41.216359 containerd[1703]: time="2026-01-15T00:26:41.216246248Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 00:26:41.216359 containerd[1703]: time="2026-01-15T00:26:41.216304568Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 00:26:41.216516 kubelet[2931]: E0115 00:26:41.216464 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 00:26:41.216847 kubelet[2931]: E0115 00:26:41.216520 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 00:26:41.217001 kubelet[2931]: E0115 00:26:41.216856 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:956fb8f8794a4941a07f54797f4e6220,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-flcf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-955f9fbff-rtvzr_calico-system(14a4f6b6-e857-4a63-b075-14b068610222): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 00:26:41.217547 containerd[1703]: time="2026-01-15T00:26:41.217346691Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 00:26:41.560919 containerd[1703]: time="2026-01-15T00:26:41.560859422Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:26:41.562226 containerd[1703]: time="2026-01-15T00:26:41.562106226Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 00:26:41.562315 containerd[1703]: 
time="2026-01-15T00:26:41.562210426Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 00:26:41.562605 kubelet[2931]: E0115 00:26:41.562537 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 00:26:41.562794 kubelet[2931]: E0115 00:26:41.562633 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 00:26:41.563060 kubelet[2931]: E0115 00:26:41.562991 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqmw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-cd46w_calico-system(576324d0-4c45-424a-9f35-c0de23b9b1ac): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 00:26:41.563662 containerd[1703]: time="2026-01-15T00:26:41.563428030Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 00:26:41.565234 kubelet[2931]: E0115 00:26:41.565190 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:26:41.872662 kubelet[2931]: E0115 00:26:41.872535 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:26:41.897546 containerd[1703]: time="2026-01-15T00:26:41.897169210Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:26:41.898659 containerd[1703]: time="2026-01-15T00:26:41.898593655Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 00:26:41.899214 containerd[1703]: time="2026-01-15T00:26:41.898606655Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 00:26:41.899277 kubelet[2931]: E0115 00:26:41.898951 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 00:26:41.899277 kubelet[2931]: E0115 00:26:41.898998 2931 kuberuntime_image.go:55] 
"Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 00:26:41.899277 kubelet[2931]: E0115 00:26:41.899140 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-flcf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-955f9fbff-rtvzr_calico-system(14a4f6b6-e857-4a63-b075-14b068610222): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 00:26:41.900392 kubelet[2931]: E0115 00:26:41.900330 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:26:44.873188 kubelet[2931]: E0115 00:26:44.872798 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:26:45.873674 containerd[1703]: time="2026-01-15T00:26:45.873636813Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 00:26:46.199616 containerd[1703]: time="2026-01-15T00:26:46.199383330Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:26:46.201191 containerd[1703]: time="2026-01-15T00:26:46.201019175Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 00:26:46.201373 containerd[1703]: time="2026-01-15T00:26:46.201259095Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 00:26:46.201499 kubelet[2931]: E0115 00:26:46.201459 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 00:26:46.201773 kubelet[2931]: E0115 00:26:46.201510 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 00:26:46.201773 kubelet[2931]: E0115 00:26:46.201617 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cx8sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-92nsn_calico-system(e8af8aef-db47-4bb0-9303-531f44a2593e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 00:26:46.203499 containerd[1703]: time="2026-01-15T00:26:46.203475862Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 00:26:46.540991 containerd[1703]: time="2026-01-15T00:26:46.540939694Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:26:46.542344 containerd[1703]: time="2026-01-15T00:26:46.542299738Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 00:26:46.542545 containerd[1703]: time="2026-01-15T00:26:46.542393739Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 00:26:46.542603 kubelet[2931]: E0115 00:26:46.542553 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 00:26:46.542710 kubelet[2931]: E0115 00:26:46.542614 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 00:26:46.542834 kubelet[2931]: E0115 00:26:46.542723 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cx8sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-92nsn_calico-system(e8af8aef-db47-4bb0-9303-531f44a2593e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 00:26:46.543922 kubelet[2931]: E0115 00:26:46.543893 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:26:47.873771 kubelet[2931]: E0115 00:26:47.873709 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:26:53.873109 kubelet[2931]: E0115 00:26:53.873048 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:26:53.873109 kubelet[2931]: E0115 00:26:53.873057 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:26:55.874951 kubelet[2931]: E0115 00:26:55.873977 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:26:58.873137 kubelet[2931]: E0115 00:26:58.873088 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:26:59.875630 kubelet[2931]: E0115 00:26:59.875572 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling 
image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:27:01.873628 kubelet[2931]: E0115 00:27:01.873579 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:27:04.873485 kubelet[2931]: E0115 00:27:04.873406 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:27:05.875981 kubelet[2931]: E0115 00:27:05.874617 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:27:10.877433 kubelet[2931]: E0115 00:27:10.875327 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:27:12.875382 kubelet[2931]: E0115 00:27:12.875323 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:27:13.872779 kubelet[2931]: E0115 00:27:13.872736 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:27:16.875070 kubelet[2931]: E0115 00:27:16.874667 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:27:16.875070 kubelet[2931]: E0115 00:27:16.874836 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:27:17.873268 kubelet[2931]: E0115 00:27:17.873208 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:27:25.873547 kubelet[2931]: E0115 00:27:25.873485 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:27:27.873862 kubelet[2931]: E0115 00:27:27.873260 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:27:28.873193 kubelet[2931]: E0115 00:27:28.873087 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:27:28.874261 kubelet[2931]: E0115 00:27:28.874215 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:27:30.873413 kubelet[2931]: E0115 00:27:30.873198 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:27:32.872793 kubelet[2931]: E0115 00:27:32.872703 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" 
Jan 15 00:27:34.334335 containerd[1703]: time="2026-01-15T00:27:34.334226799Z" level=info msg="container event discarded" container=0c4540d0d1351112a68d38f219412562c5fedeaa7d0a6c731ee0cea6fe24e24b type=CONTAINER_CREATED_EVENT Jan 15 00:27:34.334335 containerd[1703]: time="2026-01-15T00:27:34.334298719Z" level=info msg="container event discarded" container=0c4540d0d1351112a68d38f219412562c5fedeaa7d0a6c731ee0cea6fe24e24b type=CONTAINER_STARTED_EVENT Jan 15 00:27:34.346556 containerd[1703]: time="2026-01-15T00:27:34.346498757Z" level=info msg="container event discarded" container=97b986f076ca85bd12656b191071cfe79b9479c2527fbb64cef116a95567d845 type=CONTAINER_CREATED_EVENT Jan 15 00:27:34.346556 containerd[1703]: time="2026-01-15T00:27:34.346545957Z" level=info msg="container event discarded" container=97b986f076ca85bd12656b191071cfe79b9479c2527fbb64cef116a95567d845 type=CONTAINER_STARTED_EVENT Jan 15 00:27:34.346556 containerd[1703]: time="2026-01-15T00:27:34.346555437Z" level=info msg="container event discarded" container=08183a919e7ba618cae7f89a12b45b32c2bb86918246d01e17c63a93db3c9876 type=CONTAINER_CREATED_EVENT Jan 15 00:27:34.346556 containerd[1703]: time="2026-01-15T00:27:34.346563317Z" level=info msg="container event discarded" container=08183a919e7ba618cae7f89a12b45b32c2bb86918246d01e17c63a93db3c9876 type=CONTAINER_STARTED_EVENT Jan 15 00:27:34.385329 containerd[1703]: time="2026-01-15T00:27:34.385271875Z" level=info msg="container event discarded" container=66af349207e414b22989e05e0739947ad935fd5a4bbb0d2a5acff2e011cd964a type=CONTAINER_CREATED_EVENT Jan 15 00:27:34.423288 containerd[1703]: time="2026-01-15T00:27:34.423226911Z" level=info msg="container event discarded" container=265bcc13ef0763b693906fa95a92ae3d97b907a0d0e9bef37d63480e77f6db77 type=CONTAINER_CREATED_EVENT Jan 15 00:27:34.442456 containerd[1703]: time="2026-01-15T00:27:34.442403010Z" level=info msg="container event discarded" container=652a3b325d5dfe852f4a64d18402e1991306d05b0ed0ee5ba8cbf183c48ea90e type=CONTAINER_CREATED_EVENT Jan 15 00:27:34.542215 containerd[1703]: time="2026-01-15T00:27:34.542045915Z" level=info msg="container event discarded" container=265bcc13ef0763b693906fa95a92ae3d97b907a0d0e9bef37d63480e77f6db77 type=CONTAINER_STARTED_EVENT Jan 15 00:27:34.542215 containerd[1703]: time="2026-01-15T00:27:34.542147155Z" level=info msg="container event discarded" container=66af349207e414b22989e05e0739947ad935fd5a4bbb0d2a5acff2e011cd964a type=CONTAINER_STARTED_EVENT Jan 15 00:27:34.542215 containerd[1703]: time="2026-01-15T00:27:34.542209355Z" level=info msg="container event discarded" container=652a3b325d5dfe852f4a64d18402e1991306d05b0ed0ee5ba8cbf183c48ea90e type=CONTAINER_STARTED_EVENT Jan 15 00:27:39.872459 kubelet[2931]: E0115 00:27:39.872401 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:27:39.872852 kubelet[2931]: E0115 00:27:39.872805 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:27:39.874375 kubelet[2931]: E0115 00:27:39.874339 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:27:40.270294 systemd[1824]: Created slice background.slice - User Background Tasks Slice. Jan 15 00:27:40.271754 systemd[1824]: Starting systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories... Jan 15 00:27:40.291195 systemd[1824]: Finished systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories. 
Jan 15 00:27:43.872738 kubelet[2931]: E0115 00:27:43.872691 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:27:43.874637 kubelet[2931]: E0115 00:27:43.874423 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:27:44.872481 kubelet[2931]: E0115 00:27:44.872410 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:27:46.211772 containerd[1703]: time="2026-01-15T00:27:46.211594328Z" level=info msg="container event discarded" container=8fd85773f4a3a1bdb5c282f19c1ff8010859b09733c7170db160cce2e7d4dadd type=CONTAINER_CREATED_EVENT Jan 15 00:27:46.212402 containerd[1703]: time="2026-01-15T00:27:46.212355651Z" level=info msg="container event discarded" container=8fd85773f4a3a1bdb5c282f19c1ff8010859b09733c7170db160cce2e7d4dadd type=CONTAINER_STARTED_EVENT Jan 15 00:27:46.231692 containerd[1703]: time="2026-01-15T00:27:46.231621630Z" level=info msg="container event discarded" container=b956648cffd3739ea0ffaceed79148c37448843faa2ca8ea13d7cf45c8296e1e type=CONTAINER_CREATED_EVENT Jan 15 00:27:46.334157 containerd[1703]: time="2026-01-15T00:27:46.334091543Z" level=info msg="container event discarded" container=b956648cffd3739ea0ffaceed79148c37448843faa2ca8ea13d7cf45c8296e1e type=CONTAINER_STARTED_EVENT Jan 15 00:27:46.536386 containerd[1703]: time="2026-01-15T00:27:46.536306762Z" level=info msg="container event discarded" container=6db9f1f6a04d77876999e3dd848de1fd68714e4772dde678ded76bed332fd677 type=CONTAINER_CREATED_EVENT Jan 15 00:27:46.536386 containerd[1703]: time="2026-01-15T00:27:46.536372162Z" level=info msg="container event discarded" container=6db9f1f6a04d77876999e3dd848de1fd68714e4772dde678ded76bed332fd677 type=CONTAINER_STARTED_EVENT Jan 15 00:27:48.737429 containerd[1703]: time="2026-01-15T00:27:48.737059213Z" level=info msg="container event discarded" container=74ed9079d102c426017b480e8e91698b5364387037d9ab8e0f02d77389836195 type=CONTAINER_CREATED_EVENT Jan 15 00:27:48.788342 containerd[1703]: time="2026-01-15T00:27:48.788260890Z" level=info msg="container event discarded" container=74ed9079d102c426017b480e8e91698b5364387037d9ab8e0f02d77389836195 type=CONTAINER_STARTED_EVENT Jan 15 
00:27:50.873272 kubelet[2931]: E0115 00:27:50.873207 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:27:50.873272 kubelet[2931]: E0115 00:27:50.873219 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:27:52.874397 kubelet[2931]: E0115 00:27:52.874261 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:27:55.873505 kubelet[2931]: E0115 00:27:55.873452 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:27:56.873779 kubelet[2931]: E0115 00:27:56.873731 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:27:58.874299 kubelet[2931]: E0115 00:27:58.874221 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:28:02.165004 containerd[1703]: time="2026-01-15T00:28:02.164897525Z" level=info msg="container event discarded" container=5d26872dffc67f8a3885c2f6f881025649956d123c9f56e5f65f74713df04853 type=CONTAINER_CREATED_EVENT Jan 15 00:28:02.165004 containerd[1703]: time="2026-01-15T00:28:02.164991925Z" level=info msg="container event discarded" container=5d26872dffc67f8a3885c2f6f881025649956d123c9f56e5f65f74713df04853 type=CONTAINER_STARTED_EVENT Jan 15 00:28:02.316422 containerd[1703]: time="2026-01-15T00:28:02.316322948Z" level=info msg="container event discarded" container=d6ae2bcad0acad21822d5e685815846b96808ca2fac64e63071a36e05055577e type=CONTAINER_CREATED_EVENT Jan 15 00:28:02.316422 containerd[1703]: time="2026-01-15T00:28:02.316393588Z" level=info msg="container event discarded" container=d6ae2bcad0acad21822d5e685815846b96808ca2fac64e63071a36e05055577e type=CONTAINER_STARTED_EVENT Jan 15 00:28:03.872836 kubelet[2931]: E0115 00:28:03.872739 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:28:04.873083 kubelet[2931]: E0115 00:28:04.872993 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:28:04.873486 kubelet[2931]: E0115 00:28:04.873103 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:28:07.873412 kubelet[2931]: E0115 00:28:07.873325 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:28:08.874974 kubelet[2931]: E0115 00:28:08.874935 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:28:10.872611 kubelet[2931]: E0115 00:28:10.872558 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:28:15.873824 kubelet[2931]: E0115 00:28:15.873055 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:28:17.427512 update_engine[1682]: I20260115 00:28:17.427443 1682 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jan 15 00:28:17.427512 update_engine[1682]: I20260115 00:28:17.427495 1682 prefs.cc:52] 
certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 15 00:28:17.427877 update_engine[1682]: I20260115 00:28:17.427728 1682 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 15 00:28:17.428104 update_engine[1682]: I20260115 00:28:17.428061 1682 omaha_request_params.cc:62] Current group set to beta Jan 15 00:28:17.428183 update_engine[1682]: I20260115 00:28:17.428157 1682 update_attempter.cc:499] Already updated boot flags. Skipping. Jan 15 00:28:17.428223 update_engine[1682]: I20260115 00:28:17.428187 1682 update_attempter.cc:643] Scheduling an action processor start. Jan 15 00:28:17.428223 update_engine[1682]: I20260115 00:28:17.428206 1682 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 15 00:28:17.428653 update_engine[1682]: I20260115 00:28:17.428454 1682 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 15 00:28:17.428653 update_engine[1682]: I20260115 00:28:17.428515 1682 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 15 00:28:17.428653 update_engine[1682]: I20260115 00:28:17.428524 1682 omaha_request_action.cc:272] Request: Jan 15 00:28:17.428653 update_engine[1682]: Jan 15 00:28:17.428653 update_engine[1682]: Jan 15 00:28:17.428653 update_engine[1682]: Jan 15 00:28:17.428653 update_engine[1682]: Jan 15 00:28:17.428653 update_engine[1682]: Jan 15 00:28:17.428653 update_engine[1682]: Jan 15 00:28:17.428653 update_engine[1682]: Jan 15 00:28:17.428653 update_engine[1682]: Jan 15 00:28:17.428653 update_engine[1682]: I20260115 00:28:17.428530 1682 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 15 00:28:17.428913 locksmithd[1735]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 15 00:28:17.430852 update_engine[1682]: I20260115 00:28:17.430248 1682 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 15 00:28:17.431149 update_engine[1682]: I20260115 00:28:17.431113 1682 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 15 00:28:17.439696 update_engine[1682]: E20260115 00:28:17.439632 1682 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 15 00:28:17.439881 update_engine[1682]: I20260115 00:28:17.439853 1682 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 15 00:28:17.873592 kubelet[2931]: E0115 00:28:17.873543 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:28:18.257572 containerd[1703]: time="2026-01-15T00:28:18.257426427Z" level=info msg="container event discarded" container=ec0a98d515673948bef6d6782c87624d365ea7ad3486de8b7b92440eaf6b64dc type=CONTAINER_CREATED_EVENT Jan 15 00:28:18.337807 containerd[1703]: time="2026-01-15T00:28:18.337698352Z" level=info msg="container event discarded" container=ec0a98d515673948bef6d6782c87624d365ea7ad3486de8b7b92440eaf6b64dc type=CONTAINER_STARTED_EVENT Jan 15 00:28:18.874035 kubelet[2931]: E0115 00:28:18.873657 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:28:19.873085 kubelet[2931]: E0115 00:28:19.873010 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:28:20.112946 containerd[1703]: time="2026-01-15T00:28:20.112878382Z" level=info msg="container event discarded" container=97d2b1c80b745b4488e0a3c6190fa7160c5950f64e5b156b8bbfcc73533d151a type=CONTAINER_CREATED_EVENT Jan 15 00:28:20.230502 containerd[1703]: time="2026-01-15T00:28:20.230345421Z" level=info msg="container event discarded" container=97d2b1c80b745b4488e0a3c6190fa7160c5950f64e5b156b8bbfcc73533d151a type=CONTAINER_STARTED_EVENT Jan 15 00:28:21.504309 containerd[1703]: time="2026-01-15T00:28:21.504241198Z" level=info msg="container event discarded" 
container=97d2b1c80b745b4488e0a3c6190fa7160c5950f64e5b156b8bbfcc73533d151a type=CONTAINER_STOPPED_EVENT Jan 15 00:28:21.873780 kubelet[2931]: E0115 00:28:21.873260 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:28:25.242587 containerd[1703]: time="2026-01-15T00:28:25.242532312Z" level=info msg="container event discarded" container=690764bbfa7684e72300a3b6e1ef5e8108bc9089bd283608ef868c80b45b7c28 type=CONTAINER_CREATED_EVENT Jan 15 00:28:25.373500 containerd[1703]: time="2026-01-15T00:28:25.373434312Z" level=info msg="container event discarded" container=690764bbfa7684e72300a3b6e1ef5e8108bc9089bd283608ef868c80b45b7c28 type=CONTAINER_STARTED_EVENT Jan 15 00:28:25.873077 kubelet[2931]: E0115 00:28:25.872776 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:28:26.099336 systemd[1]: Started sshd@30-10.0.3.29:22-146.190.22.154:58374.service - OpenSSH per-connection server daemon (146.190.22.154:58374). Jan 15 00:28:26.099000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-10.0.3.29:22-146.190.22.154:58374 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:28:26.103256 kernel: audit: type=1130 audit(1768436906.099:820): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-10.0.3.29:22-146.190.22.154:58374 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:28:26.156361 sshd[5612]: Connection closed by 146.190.22.154 port 58374 Jan 15 00:28:26.157558 systemd[1]: sshd@30-10.0.3.29:22-146.190.22.154:58374.service: Deactivated successfully. Jan 15 00:28:26.157000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-10.0.3.29:22-146.190.22.154:58374 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:28:26.162202 kernel: audit: type=1131 audit(1768436906.157:821): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-10.0.3.29:22-146.190.22.154:58374 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:28:26.873808 kubelet[2931]: E0115 00:28:26.873744 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:28:27.340289 update_engine[1682]: I20260115 00:28:27.340135 1682 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 15 00:28:27.340661 update_engine[1682]: I20260115 00:28:27.340356 1682 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 15 00:28:27.341112 update_engine[1682]: I20260115 00:28:27.341056 1682 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 15 00:28:27.347997 update_engine[1682]: E20260115 00:28:27.347949 1682 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 15 00:28:27.348069 update_engine[1682]: I20260115 00:28:27.348026 1682 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jan 15 00:28:28.108421 containerd[1703]: time="2026-01-15T00:28:28.108361238Z" level=info msg="container event discarded" container=690764bbfa7684e72300a3b6e1ef5e8108bc9089bd283608ef868c80b45b7c28 type=CONTAINER_STOPPED_EVENT Jan 15 00:28:31.873397 kubelet[2931]: E0115 00:28:31.873355 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:28:32.874612 kubelet[2931]: E0115 00:28:32.874533 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:28:32.876307 kubelet[2931]: E0115 00:28:32.875053 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:28:36.005135 containerd[1703]: time="2026-01-15T00:28:36.004978511Z" level=info msg="container event discarded" container=2fcaf57cf90f04b7899d5cef15df33368aacd5e0cf13be3d22a705c256f86c0c type=CONTAINER_CREATED_EVENT Jan 15 00:28:36.143460 containerd[1703]: time="2026-01-15T00:28:36.143392094Z" level=info msg="container event discarded" container=2fcaf57cf90f04b7899d5cef15df33368aacd5e0cf13be3d22a705c256f86c0c type=CONTAINER_STARTED_EVENT Jan 15 00:28:36.873196 kubelet[2931]: E0115 00:28:36.872892 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:28:37.340313 update_engine[1682]: I20260115 00:28:37.340016 1682 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 15 00:28:37.340313 update_engine[1682]: I20260115 00:28:37.340159 1682 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 15 00:28:37.340864 update_engine[1682]: I20260115 00:28:37.340804 1682 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 15 00:28:37.349084 update_engine[1682]: E20260115 00:28:37.348982 1682 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 15 00:28:37.349396 update_engine[1682]: I20260115 00:28:37.349149 1682 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jan 15 00:28:37.872727 kubelet[2931]: E0115 00:28:37.872631 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:28:39.075064 containerd[1703]: time="2026-01-15T00:28:39.074990861Z" level=info msg="container event discarded" container=5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232 type=CONTAINER_CREATED_EVENT Jan 15 00:28:39.075064 containerd[1703]: time="2026-01-15T00:28:39.075037341Z" level=info msg="container event discarded" container=5fc7af651695d5a3603f8ba665d6473f4549ff7c431f66a75b9aaec126ebf232 type=CONTAINER_STARTED_EVENT Jan 15 00:28:39.156314 containerd[1703]: time="2026-01-15T00:28:39.156237070Z" level=info msg="container event discarded" container=228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291 type=CONTAINER_CREATED_EVENT Jan 15 00:28:39.156314 containerd[1703]: time="2026-01-15T00:28:39.156286870Z" level=info msg="container event discarded" container=228d97f8faae9c6b353e240bfe654645e43b38063122f96552c98182db69b291 type=CONTAINER_STARTED_EVENT Jan 15 00:28:40.105922 containerd[1703]: time="2026-01-15T00:28:40.105823854Z" level=info msg="container event discarded" container=100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13 type=CONTAINER_CREATED_EVENT Jan 15 00:28:40.105922 containerd[1703]: time="2026-01-15T00:28:40.105892774Z" level=info msg="container event discarded" container=100bcb6338a99a8acfcabf136b2355db195a23937c0c26303cbe3107fc5fca13 type=CONTAINER_STARTED_EVENT Jan 15 00:28:40.873652 kubelet[2931]: E0115 00:28:40.873242 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:28:41.143105 containerd[1703]: time="2026-01-15T00:28:41.142868146Z" level=info msg="container event discarded" container=188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9 type=CONTAINER_CREATED_EVENT Jan 15 00:28:41.143105 containerd[1703]: time="2026-01-15T00:28:41.142918146Z" level=info msg="container event discarded" 
container=188eb2c0bcd0ca036b0c0e472525b87e689be3ec1ffc66f1560c4043a2d3bab9 type=CONTAINER_STARTED_EVENT Jan 15 00:28:41.265695 containerd[1703]: time="2026-01-15T00:28:41.265429241Z" level=info msg="container event discarded" container=d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932 type=CONTAINER_CREATED_EVENT Jan 15 00:28:41.265695 containerd[1703]: time="2026-01-15T00:28:41.265481441Z" level=info msg="container event discarded" container=d77d40f0c54df5f56894c3c589ddd7f609dd36a053633fbf513a481a8b204932 type=CONTAINER_STARTED_EVENT Jan 15 00:28:41.283787 containerd[1703]: time="2026-01-15T00:28:41.283724177Z" level=info msg="container event discarded" container=453934082c8179e198c87b5fcbf94a618256a2ce3676e0c6be4eaad9aae87331 type=CONTAINER_CREATED_EVENT Jan 15 00:28:41.341165 containerd[1703]: time="2026-01-15T00:28:41.341078752Z" level=info msg="container event discarded" container=453934082c8179e198c87b5fcbf94a618256a2ce3676e0c6be4eaad9aae87331 type=CONTAINER_STARTED_EVENT Jan 15 00:28:42.228578 containerd[1703]: time="2026-01-15T00:28:42.228446627Z" level=info msg="container event discarded" container=adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb type=CONTAINER_CREATED_EVENT Jan 15 00:28:42.228578 containerd[1703]: time="2026-01-15T00:28:42.228532027Z" level=info msg="container event discarded" container=adf4d91c85e20c67f2f94c85880e639dcb49846071f55e2762e1c33d92bffcfb type=CONTAINER_STARTED_EVENT Jan 15 00:28:43.120652 containerd[1703]: time="2026-01-15T00:28:43.120558395Z" level=info msg="container event discarded" container=d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422 type=CONTAINER_CREATED_EVENT Jan 15 00:28:43.120652 containerd[1703]: time="2026-01-15T00:28:43.120607075Z" level=info msg="container event discarded" container=d94e51ca3096bc97fc0b5d0dcfc3f7eda2c02d7bad6f22a0375291aaf7097422 type=CONTAINER_STARTED_EVENT Jan 15 00:28:43.152018 containerd[1703]: time="2026-01-15T00:28:43.151925331Z" level=info msg="container event discarded" container=7906d588f8527998ccd1237ec7dbe84cd824125d3e4cc789dc60cae3a2f4ec84 type=CONTAINER_CREATED_EVENT Jan 15 00:28:43.202481 containerd[1703]: time="2026-01-15T00:28:43.202293925Z" level=info msg="container event discarded" container=7906d588f8527998ccd1237ec7dbe84cd824125d3e4cc789dc60cae3a2f4ec84 type=CONTAINER_STARTED_EVENT Jan 15 00:28:43.874204 kubelet[2931]: E0115 00:28:43.874145 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:28:44.111213 containerd[1703]: time="2026-01-15T00:28:44.111143065Z" level=info msg="container event discarded" container=c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1 type=CONTAINER_CREATED_EVENT Jan 15 00:28:44.111213 
containerd[1703]: time="2026-01-15T00:28:44.111201345Z" level=info msg="container event discarded" container=c1560968124df53d630f7769099e6f3d2bc131a35dcd782818b0981c3bfbdaf1 type=CONTAINER_STARTED_EVENT Jan 15 00:28:44.873710 kubelet[2931]: E0115 00:28:44.873659 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:28:46.876223 kubelet[2931]: E0115 00:28:46.876008 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:28:47.339463 update_engine[1682]: I20260115 00:28:47.339387 1682 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 15 00:28:47.339822 update_engine[1682]: I20260115 00:28:47.339478 1682 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 15 00:28:47.339850 update_engine[1682]: I20260115 00:28:47.339819 1682 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 15 00:28:47.346499 update_engine[1682]: E20260115 00:28:47.346446 1682 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 15 00:28:47.346633 update_engine[1682]: I20260115 00:28:47.346531 1682 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 15 00:28:47.346633 update_engine[1682]: I20260115 00:28:47.346540 1682 omaha_request_action.cc:617] Omaha request response: Jan 15 00:28:47.346633 update_engine[1682]: E20260115 00:28:47.346619 1682 omaha_request_action.cc:636] Omaha request network transfer failed. Jan 15 00:28:47.346699 update_engine[1682]: I20260115 00:28:47.346637 1682 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Jan 15 00:28:47.346699 update_engine[1682]: I20260115 00:28:47.346642 1682 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 15 00:28:47.346699 update_engine[1682]: I20260115 00:28:47.346647 1682 update_attempter.cc:306] Processing Done. Jan 15 00:28:47.346699 update_engine[1682]: E20260115 00:28:47.346660 1682 update_attempter.cc:619] Update failed. Jan 15 00:28:47.346699 update_engine[1682]: I20260115 00:28:47.346664 1682 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Jan 15 00:28:47.346699 update_engine[1682]: I20260115 00:28:47.346669 1682 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Jan 15 00:28:47.346699 update_engine[1682]: I20260115 00:28:47.346673 1682 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Jan 15 00:28:47.346846 update_engine[1682]: I20260115 00:28:47.346740 1682 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 15 00:28:47.346846 update_engine[1682]: I20260115 00:28:47.346760 1682 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 15 00:28:47.346846 update_engine[1682]: I20260115 00:28:47.346766 1682 omaha_request_action.cc:272] Request: Jan 15 00:28:47.346846 update_engine[1682]: Jan 15 00:28:47.346846 update_engine[1682]: Jan 15 00:28:47.346846 update_engine[1682]: Jan 15 00:28:47.346846 update_engine[1682]: Jan 15 00:28:47.346846 update_engine[1682]: Jan 15 00:28:47.346846 update_engine[1682]: Jan 15 00:28:47.346846 update_engine[1682]: I20260115 00:28:47.346772 1682 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 15 00:28:47.346846 update_engine[1682]: I20260115 00:28:47.346791 1682 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 15 00:28:47.347367 update_engine[1682]: I20260115 00:28:47.347126 1682 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 15 00:28:47.347427 locksmithd[1735]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Jan 15 00:28:47.354885 update_engine[1682]: E20260115 00:28:47.354830 1682 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 15 00:28:47.354963 update_engine[1682]: I20260115 00:28:47.354923 1682 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 15 00:28:47.354963 update_engine[1682]: I20260115 00:28:47.354933 1682 omaha_request_action.cc:617] Omaha request response: Jan 15 00:28:47.354963 update_engine[1682]: I20260115 00:28:47.354940 1682 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 15 00:28:47.354963 update_engine[1682]: I20260115 00:28:47.354944 1682 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 15 00:28:47.354963 update_engine[1682]: I20260115 00:28:47.354948 1682 update_attempter.cc:306] Processing Done. Jan 15 00:28:47.354963 update_engine[1682]: I20260115 00:28:47.354954 1682 update_attempter.cc:310] Error event sent. 
Jan 15 00:28:47.354963 update_engine[1682]: I20260115 00:28:47.354961 1682 update_check_scheduler.cc:74] Next update check in 40m53s Jan 15 00:28:47.355665 locksmithd[1735]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Jan 15 00:28:49.872344 kubelet[2931]: E0115 00:28:49.872300 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:28:50.876214 kubelet[2931]: E0115 00:28:50.873649 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:28:51.876674 kubelet[2931]: E0115 00:28:51.876615 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:28:56.874989 kubelet[2931]: E0115 00:28:56.874932 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:28:57.873379 kubelet[2931]: E0115 00:28:57.873320 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:29:00.874007 kubelet[2931]: E0115 00:29:00.873882 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:29:00.874007 kubelet[2931]: E0115 00:29:00.873925 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:29:03.873682 kubelet[2931]: E0115 00:29:03.873543 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:29:04.873199 kubelet[2931]: E0115 00:29:04.873103 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:29:05.411000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-10.0.3.29:22-20.161.92.111:57106 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:05.413077 systemd[1]: Started sshd@31-10.0.3.29:22-20.161.92.111:57106.service - OpenSSH per-connection server daemon (20.161.92.111:57106). 
Jan 15 00:29:05.417190 kernel: audit: type=1130 audit(1768436945.411:822): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-10.0.3.29:22-20.161.92.111:57106 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:05.960000 audit[5656]: USER_ACCT pid=5656 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:05.962217 sshd[5656]: Accepted publickey for core from 20.161.92.111 port 57106 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:29:05.963000 audit[5656]: CRED_ACQ pid=5656 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:05.965731 sshd-session[5656]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:29:05.971093 kernel: audit: type=1101 audit(1768436945.960:823): pid=5656 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:05.971189 kernel: audit: type=1103 audit(1768436945.963:824): pid=5656 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:05.971223 kernel: audit: type=1006 audit(1768436945.964:825): pid=5656 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Jan 15 00:29:05.964000 audit[5656]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdd77d6e0 a2=3 a3=0 items=0 ppid=1 pid=5656 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:05.977667 kernel: audit: type=1300 audit(1768436945.964:825): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdd77d6e0 a2=3 a3=0 items=0 ppid=1 pid=5656 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:05.964000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:05.979388 kernel: audit: type=1327 audit(1768436945.964:825): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:05.981241 systemd-logind[1681]: New session 10 of user core. Jan 15 00:29:05.992549 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 15 00:29:05.993000 audit[5656]: USER_START pid=5656 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:05.998000 audit[5659]: CRED_ACQ pid=5659 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:06.000227 kernel: audit: type=1105 audit(1768436945.993:826): pid=5656 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:06.005211 kernel: audit: type=1103 audit(1768436945.998:827): pid=5659 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:06.330423 sshd[5659]: Connection closed by 20.161.92.111 port 57106 Jan 15 00:29:06.331379 sshd-session[5656]: pam_unix(sshd:session): session closed for user core Jan 15 00:29:06.331000 audit[5656]: USER_END pid=5656 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:06.336567 systemd[1]: session-10.scope: Deactivated successfully. Jan 15 00:29:06.331000 audit[5656]: CRED_DISP pid=5656 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:06.339498 systemd[1]: sshd@31-10.0.3.29:22-20.161.92.111:57106.service: Deactivated successfully. Jan 15 00:29:06.340967 kernel: audit: type=1106 audit(1768436946.331:828): pid=5656 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:06.341036 kernel: audit: type=1104 audit(1768436946.331:829): pid=5656 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:06.341032 systemd-logind[1681]: Session 10 logged out. Waiting for processes to exit. Jan 15 00:29:06.338000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-10.0.3.29:22-20.161.92.111:57106 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:06.345119 systemd-logind[1681]: Removed session 10. 
Jan 15 00:29:07.873772 kubelet[2931]: E0115 00:29:07.873548 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:29:11.437015 systemd[1]: Started sshd@32-10.0.3.29:22-20.161.92.111:57114.service - OpenSSH per-connection server daemon (20.161.92.111:57114). Jan 15 00:29:11.435000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-10.0.3.29:22-20.161.92.111:57114 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:11.438036 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 15 00:29:11.438100 kernel: audit: type=1130 audit(1768436951.435:831): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-10.0.3.29:22-20.161.92.111:57114 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:11.872873 kubelet[2931]: E0115 00:29:11.872804 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:29:11.963000 audit[5699]: USER_ACCT pid=5699 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:11.965360 sshd[5699]: Accepted publickey for core from 20.161.92.111 port 57114 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:29:11.968610 sshd-session[5699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:29:11.967000 audit[5699]: CRED_ACQ pid=5699 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:11.973596 kernel: audit: type=1101 audit(1768436951.963:832): pid=5699 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 
00:29:11.973692 kernel: audit: type=1103 audit(1768436951.967:833): pid=5699 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:11.976589 kernel: audit: type=1006 audit(1768436951.967:834): pid=5699 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 15 00:29:11.967000 audit[5699]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffded98100 a2=3 a3=0 items=0 ppid=1 pid=5699 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:11.980392 kernel: audit: type=1300 audit(1768436951.967:834): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffded98100 a2=3 a3=0 items=0 ppid=1 pid=5699 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:11.967000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:11.982154 kernel: audit: type=1327 audit(1768436951.967:834): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:11.984952 systemd-logind[1681]: New session 11 of user core. Jan 15 00:29:11.992410 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 15 00:29:11.993000 audit[5699]: USER_START pid=5699 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:11.995000 audit[5702]: CRED_ACQ pid=5702 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:12.002053 kernel: audit: type=1105 audit(1768436951.993:835): pid=5699 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:12.002156 kernel: audit: type=1103 audit(1768436951.995:836): pid=5702 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:12.319955 sshd[5702]: Connection closed by 20.161.92.111 port 57114 Jan 15 00:29:12.320550 sshd-session[5699]: pam_unix(sshd:session): session closed for user core Jan 15 00:29:12.320000 audit[5699]: USER_END pid=5699 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:12.323774 systemd[1]: 
sshd@32-10.0.3.29:22-20.161.92.111:57114.service: Deactivated successfully. Jan 15 00:29:12.325544 systemd[1]: session-11.scope: Deactivated successfully. Jan 15 00:29:12.320000 audit[5699]: CRED_DISP pid=5699 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:12.327837 systemd-logind[1681]: Session 11 logged out. Waiting for processes to exit. Jan 15 00:29:12.328652 systemd-logind[1681]: Removed session 11. Jan 15 00:29:12.328811 kernel: audit: type=1106 audit(1768436952.320:837): pid=5699 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:12.328865 kernel: audit: type=1104 audit(1768436952.320:838): pid=5699 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:12.320000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-10.0.3.29:22-20.161.92.111:57114 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:12.874498 kubelet[2931]: E0115 00:29:12.874421 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:29:14.872848 kubelet[2931]: E0115 00:29:14.872771 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:29:16.873223 containerd[1703]: time="2026-01-15T00:29:16.873151354Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 00:29:16.874080 kubelet[2931]: E0115 00:29:16.874040 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc 
error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:29:17.404220 containerd[1703]: time="2026-01-15T00:29:17.404154898Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:29:17.405827 containerd[1703]: time="2026-01-15T00:29:17.405713103Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 00:29:17.405827 containerd[1703]: time="2026-01-15T00:29:17.405789303Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 00:29:17.406026 kubelet[2931]: E0115 00:29:17.405987 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:29:17.406087 kubelet[2931]: E0115 00:29:17.406036 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:29:17.406288 kubelet[2931]: E0115 00:29:17.406206 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5q7b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-d7bdbd78b-v4vh7_calico-apiserver(782417a5-ecd0-40c5-85c0-45ead5d347fd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 00:29:17.409464 kubelet[2931]: E0115 00:29:17.409349 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:29:17.430000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-10.0.3.29:22-20.161.92.111:49938 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:17.431598 systemd[1]: Started sshd@33-10.0.3.29:22-20.161.92.111:49938.service - OpenSSH per-connection server daemon (20.161.92.111:49938). Jan 15 00:29:17.432458 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 15 00:29:17.432492 kernel: audit: type=1130 audit(1768436957.430:840): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-10.0.3.29:22-20.161.92.111:49938 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:29:17.950000 audit[5719]: USER_ACCT pid=5719 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:17.951767 sshd[5719]: Accepted publickey for core from 20.161.92.111 port 49938 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:29:17.955234 kernel: audit: type=1101 audit(1768436957.950:841): pid=5719 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:17.954000 audit[5719]: CRED_ACQ pid=5719 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:17.956149 sshd-session[5719]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:29:17.961264 kernel: audit: type=1103 audit(1768436957.954:842): pid=5719 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:17.961331 kernel: audit: type=1006 audit(1768436957.954:843): pid=5719 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Jan 15 00:29:17.961357 kernel: audit: type=1300 audit(1768436957.954:843): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff76838e0 a2=3 a3=0 items=0 ppid=1 pid=5719 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:17.954000 audit[5719]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff76838e0 a2=3 a3=0 items=0 ppid=1 pid=5719 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:17.960985 systemd-logind[1681]: New session 12 of user core. Jan 15 00:29:17.954000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:17.965669 kernel: audit: type=1327 audit(1768436957.954:843): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:17.968391 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jan 15 00:29:17.971000 audit[5719]: USER_START pid=5719 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:17.973000 audit[5722]: CRED_ACQ pid=5722 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:17.980042 kernel: audit: type=1105 audit(1768436957.971:844): pid=5719 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:17.980134 kernel: audit: type=1103 audit(1768436957.973:845): pid=5722 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:18.316988 sshd[5722]: Connection closed by 20.161.92.111 port 49938 Jan 15 00:29:18.318079 sshd-session[5719]: pam_unix(sshd:session): session closed for user core Jan 15 00:29:18.318000 audit[5719]: USER_END pid=5719 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:18.318000 audit[5719]: CRED_DISP pid=5719 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:18.323597 systemd[1]: sshd@33-10.0.3.29:22-20.161.92.111:49938.service: Deactivated successfully. Jan 15 00:29:18.325422 systemd[1]: session-12.scope: Deactivated successfully. Jan 15 00:29:18.327217 kernel: audit: type=1106 audit(1768436958.318:846): pid=5719 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:18.327290 kernel: audit: type=1104 audit(1768436958.318:847): pid=5719 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:18.322000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-10.0.3.29:22-20.161.92.111:49938 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:18.328431 systemd-logind[1681]: Session 12 logged out. Waiting for processes to exit. Jan 15 00:29:18.330243 systemd-logind[1681]: Removed session 12. 
Jan 15 00:29:18.432392 systemd[1]: Started sshd@34-10.0.3.29:22-20.161.92.111:49950.service - OpenSSH per-connection server daemon (20.161.92.111:49950). Jan 15 00:29:18.431000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-10.0.3.29:22-20.161.92.111:49950 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:18.959000 audit[5736]: USER_ACCT pid=5736 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:18.960931 sshd[5736]: Accepted publickey for core from 20.161.92.111 port 49950 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:29:18.960000 audit[5736]: CRED_ACQ pid=5736 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:18.960000 audit[5736]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3979780 a2=3 a3=0 items=0 ppid=1 pid=5736 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:18.960000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:18.962107 sshd-session[5736]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:29:18.967235 systemd-logind[1681]: New session 13 of user core. Jan 15 00:29:18.978548 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 15 00:29:18.979000 audit[5736]: USER_START pid=5736 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:18.981000 audit[5739]: CRED_ACQ pid=5739 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:19.344102 sshd[5739]: Connection closed by 20.161.92.111 port 49950 Jan 15 00:29:19.344806 sshd-session[5736]: pam_unix(sshd:session): session closed for user core Jan 15 00:29:19.345000 audit[5736]: USER_END pid=5736 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:19.345000 audit[5736]: CRED_DISP pid=5736 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:19.349943 systemd[1]: sshd@34-10.0.3.29:22-20.161.92.111:49950.service: Deactivated successfully. 
Jan 15 00:29:19.349000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-10.0.3.29:22-20.161.92.111:49950 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:19.351770 systemd[1]: session-13.scope: Deactivated successfully. Jan 15 00:29:19.352582 systemd-logind[1681]: Session 13 logged out. Waiting for processes to exit. Jan 15 00:29:19.353992 systemd-logind[1681]: Removed session 13. Jan 15 00:29:19.450000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-10.0.3.29:22-20.161.92.111:49952 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:19.451561 systemd[1]: Started sshd@35-10.0.3.29:22-20.161.92.111:49952.service - OpenSSH per-connection server daemon (20.161.92.111:49952). Jan 15 00:29:19.971000 audit[5753]: USER_ACCT pid=5753 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:19.972844 sshd[5753]: Accepted publickey for core from 20.161.92.111 port 49952 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:29:19.973000 audit[5753]: CRED_ACQ pid=5753 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:19.973000 audit[5753]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff6a9cea0 a2=3 a3=0 items=0 ppid=1 pid=5753 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:19.973000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:19.974720 sshd-session[5753]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:29:19.980031 systemd-logind[1681]: New session 14 of user core. Jan 15 00:29:19.989407 systemd[1]: Started session-14.scope - Session 14 of User core. 
Jan 15 00:29:19.990000 audit[5753]: USER_START pid=5753 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:19.992000 audit[5756]: CRED_ACQ pid=5756 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:20.326857 sshd[5756]: Connection closed by 20.161.92.111 port 49952 Jan 15 00:29:20.327459 sshd-session[5753]: pam_unix(sshd:session): session closed for user core Jan 15 00:29:20.327000 audit[5753]: USER_END pid=5753 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:20.327000 audit[5753]: CRED_DISP pid=5753 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:20.330000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-10.0.3.29:22-20.161.92.111:49952 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:20.330999 systemd[1]: sshd@35-10.0.3.29:22-20.161.92.111:49952.service: Deactivated successfully. Jan 15 00:29:20.332829 systemd[1]: session-14.scope: Deactivated successfully. Jan 15 00:29:20.334794 systemd-logind[1681]: Session 14 logged out. Waiting for processes to exit. Jan 15 00:29:20.335638 systemd-logind[1681]: Removed session 14. 
Jan 15 00:29:21.872836 containerd[1703]: time="2026-01-15T00:29:21.872797326Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 00:29:22.205538 containerd[1703]: time="2026-01-15T00:29:22.205323663Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:29:22.206753 containerd[1703]: time="2026-01-15T00:29:22.206715788Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 00:29:22.206847 containerd[1703]: time="2026-01-15T00:29:22.206799108Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 00:29:22.208271 kubelet[2931]: E0115 00:29:22.208205 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 00:29:22.208271 kubelet[2931]: E0115 00:29:22.208266 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 00:29:22.208760 kubelet[2931]: E0115 00:29:22.208380 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:956fb8f8794a4941a07f54797f4e6220,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-flcf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-955f9fbff-rtvzr_calico-system(14a4f6b6-e857-4a63-b075-14b068610222): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 00:29:22.210816 containerd[1703]: time="2026-01-15T00:29:22.210792400Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 00:29:22.599112 containerd[1703]: time="2026-01-15T00:29:22.599066548Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:29:22.600415 containerd[1703]: time="2026-01-15T00:29:22.600325752Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 00:29:22.600415 containerd[1703]: time="2026-01-15T00:29:22.600362752Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 00:29:22.600578 kubelet[2931]: E0115 00:29:22.600532 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 00:29:22.600674 kubelet[2931]: E0115 00:29:22.600595 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 00:29:22.600904 kubelet[2931]: E0115 00:29:22.600862 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-flcf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-955f9fbff-rtvzr_calico-system(14a4f6b6-e857-4a63-b075-14b068610222): 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 00:29:22.602090 kubelet[2931]: E0115 00:29:22.602040 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:29:25.438000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-10.0.3.29:22-20.161.92.111:38448 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:25.439102 systemd[1]: Started sshd@36-10.0.3.29:22-20.161.92.111:38448.service - OpenSSH per-connection server daemon (20.161.92.111:38448). Jan 15 00:29:25.442700 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 15 00:29:25.442750 kernel: audit: type=1130 audit(1768436965.438:867): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-10.0.3.29:22-20.161.92.111:38448 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:25.873422 containerd[1703]: time="2026-01-15T00:29:25.873360643Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 00:29:25.960000 audit[5770]: USER_ACCT pid=5770 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:25.961821 sshd[5770]: Accepted publickey for core from 20.161.92.111 port 38448 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:29:25.965207 kernel: audit: type=1101 audit(1768436965.960:868): pid=5770 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:25.964000 audit[5770]: CRED_ACQ pid=5770 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:25.965979 sshd-session[5770]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:29:25.970636 kernel: audit: type=1103 audit(1768436965.964:869): pid=5770 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:25.970735 kernel: audit: type=1006 
audit(1768436965.964:870): pid=5770 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Jan 15 00:29:25.970784 kernel: audit: type=1300 audit(1768436965.964:870): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd9bf6940 a2=3 a3=0 items=0 ppid=1 pid=5770 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:25.964000 audit[5770]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd9bf6940 a2=3 a3=0 items=0 ppid=1 pid=5770 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:25.972887 systemd-logind[1681]: New session 15 of user core. Jan 15 00:29:25.964000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:25.975408 kernel: audit: type=1327 audit(1768436965.964:870): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:25.986528 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 15 00:29:25.988000 audit[5770]: USER_START pid=5770 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:25.989000 audit[5773]: CRED_ACQ pid=5773 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:25.996027 kernel: audit: type=1105 audit(1768436965.988:871): pid=5770 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:25.996101 kernel: audit: type=1103 audit(1768436965.989:872): pid=5773 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:26.203108 containerd[1703]: time="2026-01-15T00:29:26.202978451Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:29:26.205137 containerd[1703]: time="2026-01-15T00:29:26.205019937Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 00:29:26.205137 containerd[1703]: time="2026-01-15T00:29:26.205095977Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 15 00:29:26.205278 kubelet[2931]: E0115 00:29:26.205234 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve 
image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 00:29:26.205681 kubelet[2931]: E0115 00:29:26.205294 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 00:29:26.205681 kubelet[2931]: E0115 00:29:26.205406 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-568wc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7877d5fb5-c885v_calico-system(051b417e-bac4-4f72-8b07-3775d126567f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 00:29:26.206791 kubelet[2931]: E0115 00:29:26.206751 2931 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:29:26.345263 sshd[5773]: Connection closed by 20.161.92.111 port 38448 Jan 15 00:29:26.345421 sshd-session[5770]: pam_unix(sshd:session): session closed for user core Jan 15 00:29:26.346000 audit[5770]: USER_END pid=5770 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:26.346000 audit[5770]: CRED_DISP pid=5770 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:26.351735 systemd[1]: sshd@36-10.0.3.29:22-20.161.92.111:38448.service: Deactivated successfully. Jan 15 00:29:26.354903 kernel: audit: type=1106 audit(1768436966.346:873): pid=5770 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:26.355031 kernel: audit: type=1104 audit(1768436966.346:874): pid=5770 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:26.350000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-10.0.3.29:22-20.161.92.111:38448 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:26.355279 systemd[1]: session-15.scope: Deactivated successfully. Jan 15 00:29:26.356349 systemd-logind[1681]: Session 15 logged out. Waiting for processes to exit. Jan 15 00:29:26.358032 systemd-logind[1681]: Removed session 15. 
Jan 15 00:29:26.875014 containerd[1703]: time="2026-01-15T00:29:26.874932946Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 00:29:27.216390 containerd[1703]: time="2026-01-15T00:29:27.216268590Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:29:27.220831 containerd[1703]: time="2026-01-15T00:29:27.220781724Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 00:29:27.221054 containerd[1703]: time="2026-01-15T00:29:27.220849884Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 00:29:27.221093 kubelet[2931]: E0115 00:29:27.221020 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:29:27.221093 kubelet[2931]: E0115 00:29:27.221069 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 00:29:27.221394 kubelet[2931]: E0115 00:29:27.221191 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xck2k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-d7bdbd78b-nn9k9_calico-apiserver(933e7fe5-e25e-48cf-938a-716b1fa3d838): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 00:29:27.222552 kubelet[2931]: E0115 00:29:27.222514 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:29:27.874037 containerd[1703]: time="2026-01-15T00:29:27.873983482Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 00:29:28.198260 containerd[1703]: time="2026-01-15T00:29:28.198131273Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:29:28.202149 containerd[1703]: time="2026-01-15T00:29:28.202075846Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 00:29:28.202299 containerd[1703]: time="2026-01-15T00:29:28.202116566Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 00:29:28.202467 kubelet[2931]: E0115 00:29:28.202375 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 00:29:28.202467 kubelet[2931]: E0115 00:29:28.202425 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 00:29:28.202611 kubelet[2931]: E0115 00:29:28.202527 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cx8sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-92nsn_calico-system(e8af8aef-db47-4bb0-9303-531f44a2593e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 00:29:28.205676 containerd[1703]: time="2026-01-15T00:29:28.205643256Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 00:29:28.527130 containerd[1703]: time="2026-01-15T00:29:28.527073880Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:29:28.528561 containerd[1703]: time="2026-01-15T00:29:28.528451764Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 00:29:28.528561 containerd[1703]: time="2026-01-15T00:29:28.528505324Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 00:29:28.528832 kubelet[2931]: E0115 00:29:28.528793 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 00:29:28.529396 kubelet[2931]: E0115 00:29:28.529141 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 00:29:28.529396 kubelet[2931]: E0115 00:29:28.529329 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cx8sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-92nsn_calico-system(e8af8aef-db47-4bb0-9303-531f44a2593e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 00:29:28.530533 kubelet[2931]: E0115 00:29:28.530497 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:29:28.873423 kubelet[2931]: E0115 00:29:28.873306 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:29:28.875251 containerd[1703]: time="2026-01-15T00:29:28.874745063Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 00:29:29.391293 containerd[1703]: time="2026-01-15T00:29:29.391246923Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 00:29:29.392614 containerd[1703]: time="2026-01-15T00:29:29.392558887Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 00:29:29.392731 containerd[1703]: time="2026-01-15T00:29:29.392596407Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 00:29:29.392952 kubelet[2931]: E0115 00:29:29.392874 2931 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 00:29:29.393126 kubelet[2931]: E0115 00:29:29.393104 2931 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 00:29:29.393603 kubelet[2931]: E0115 00:29:29.393440 2931 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqmw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-cd46w_calico-system(576324d0-4c45-424a-9f35-c0de23b9b1ac): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 00:29:29.394951 kubelet[2931]: E0115 00:29:29.394913 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:29:29.576511 systemd[1]: Started sshd@37-10.0.3.29:22-146.190.22.154:34770.service - OpenSSH per-connection server daemon (146.190.22.154:34770). Jan 15 00:29:29.575000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-10.0.3.29:22-146.190.22.154:34770 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:29.912948 sshd[5790]: Connection closed by authenticating user root 146.190.22.154 port 34770 [preauth] Jan 15 00:29:29.912000 audit[5790]: USER_ERR pid=5790 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=146.190.22.154 addr=146.190.22.154 terminal=ssh res=failed' Jan 15 00:29:29.915602 systemd[1]: sshd@37-10.0.3.29:22-146.190.22.154:34770.service: Deactivated successfully. Jan 15 00:29:29.916000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-10.0.3.29:22-146.190.22.154:34770 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:31.457000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-10.0.3.29:22-20.161.92.111:38456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:29:31.460419 kernel: kauditd_printk_skb: 4 callbacks suppressed Jan 15 00:29:31.460453 kernel: audit: type=1130 audit(1768436971.457:879): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-10.0.3.29:22-20.161.92.111:38456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:31.458936 systemd[1]: Started sshd@38-10.0.3.29:22-20.161.92.111:38456.service - OpenSSH per-connection server daemon (20.161.92.111:38456). Jan 15 00:29:31.986668 sshd[5797]: Accepted publickey for core from 20.161.92.111 port 38456 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:29:31.985000 audit[5797]: USER_ACCT pid=5797 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:31.989264 sshd-session[5797]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:29:31.987000 audit[5797]: CRED_ACQ pid=5797 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:31.995642 kernel: audit: type=1101 audit(1768436971.985:880): pid=5797 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:31.995764 kernel: audit: type=1103 audit(1768436971.987:881): pid=5797 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:31.995795 kernel: audit: type=1006 audit(1768436971.987:882): pid=5797 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Jan 15 00:29:31.987000 audit[5797]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd89a4100 a2=3 a3=0 items=0 ppid=1 pid=5797 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:32.001656 kernel: audit: type=1300 audit(1768436971.987:882): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd89a4100 a2=3 a3=0 items=0 ppid=1 pid=5797 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:31.987000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:32.003577 kernel: audit: type=1327 audit(1768436971.987:882): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:32.007240 systemd-logind[1681]: New session 16 of user core. Jan 15 00:29:32.015730 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jan 15 00:29:32.018000 audit[5797]: USER_START pid=5797 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:32.022000 audit[5800]: CRED_ACQ pid=5800 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:32.026944 kernel: audit: type=1105 audit(1768436972.018:883): pid=5797 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:32.027031 kernel: audit: type=1103 audit(1768436972.022:884): pid=5800 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:32.348020 sshd[5800]: Connection closed by 20.161.92.111 port 38456 Jan 15 00:29:32.349696 sshd-session[5797]: pam_unix(sshd:session): session closed for user core Jan 15 00:29:32.350000 audit[5797]: USER_END pid=5797 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:32.356442 systemd[1]: sshd@38-10.0.3.29:22-20.161.92.111:38456.service: Deactivated successfully. Jan 15 00:29:32.359989 kernel: audit: type=1106 audit(1768436972.350:885): pid=5797 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:32.360068 kernel: audit: type=1104 audit(1768436972.350:886): pid=5797 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:32.350000 audit[5797]: CRED_DISP pid=5797 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:32.358923 systemd[1]: session-16.scope: Deactivated successfully. Jan 15 00:29:32.360049 systemd-logind[1681]: Session 16 logged out. Waiting for processes to exit. Jan 15 00:29:32.355000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-10.0.3.29:22-20.161.92.111:38456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:32.365403 systemd-logind[1681]: Removed session 16. 
Jan 15 00:29:33.873592 kubelet[2931]: E0115 00:29:33.873500 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:29:37.459942 systemd[1]: Started sshd@39-10.0.3.29:22-20.161.92.111:36512.service - OpenSSH per-connection server daemon (20.161.92.111:36512). Jan 15 00:29:37.459000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-10.0.3.29:22-20.161.92.111:36512 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:37.464538 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 15 00:29:37.464654 kernel: audit: type=1130 audit(1768436977.459:888): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-10.0.3.29:22-20.161.92.111:36512 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:38.001000 audit[5813]: USER_ACCT pid=5813 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:38.005695 sshd[5813]: Accepted publickey for core from 20.161.92.111 port 36512 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:29:38.006241 sshd-session[5813]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:29:38.004000 audit[5813]: CRED_ACQ pid=5813 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:38.010584 kernel: audit: type=1101 audit(1768436978.001:889): pid=5813 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:38.010653 kernel: audit: type=1103 audit(1768436978.004:890): pid=5813 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:38.012884 kernel: audit: type=1006 audit(1768436978.004:891): pid=5813 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Jan 15 00:29:38.012944 kernel: audit: type=1300 
audit(1768436978.004:891): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff19719f0 a2=3 a3=0 items=0 ppid=1 pid=5813 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:38.004000 audit[5813]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff19719f0 a2=3 a3=0 items=0 ppid=1 pid=5813 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:38.016798 systemd-logind[1681]: New session 17 of user core. Jan 15 00:29:38.017153 kernel: audit: type=1327 audit(1768436978.004:891): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:38.004000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:38.021362 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 15 00:29:38.022000 audit[5813]: USER_START pid=5813 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:38.027225 kernel: audit: type=1105 audit(1768436978.022:892): pid=5813 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:38.026000 audit[5816]: CRED_ACQ pid=5816 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:38.031203 kernel: audit: type=1103 audit(1768436978.026:893): pid=5816 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:38.391028 sshd[5816]: Connection closed by 20.161.92.111 port 36512 Jan 15 00:29:38.391372 sshd-session[5813]: pam_unix(sshd:session): session closed for user core Jan 15 00:29:38.392000 audit[5813]: USER_END pid=5813 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:38.396038 systemd[1]: sshd@39-10.0.3.29:22-20.161.92.111:36512.service: Deactivated successfully. Jan 15 00:29:38.392000 audit[5813]: CRED_DISP pid=5813 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:38.399463 systemd[1]: session-17.scope: Deactivated successfully. Jan 15 00:29:38.400343 systemd-logind[1681]: Session 17 logged out. Waiting for processes to exit. 
Jan 15 00:29:38.401238 kernel: audit: type=1106 audit(1768436978.392:894): pid=5813 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:38.401677 kernel: audit: type=1104 audit(1768436978.392:895): pid=5813 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:38.395000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-10.0.3.29:22-20.161.92.111:36512 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:38.402587 systemd-logind[1681]: Removed session 17. Jan 15 00:29:38.503472 systemd[1]: Started sshd@40-10.0.3.29:22-20.161.92.111:36516.service - OpenSSH per-connection server daemon (20.161.92.111:36516). Jan 15 00:29:38.502000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-10.0.3.29:22-20.161.92.111:36516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:38.873706 kubelet[2931]: E0115 00:29:38.873643 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:29:39.035000 audit[5856]: USER_ACCT pid=5856 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:39.036712 sshd[5856]: Accepted publickey for core from 20.161.92.111 port 36516 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:29:39.037000 audit[5856]: CRED_ACQ pid=5856 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:39.037000 audit[5856]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdb1ee180 a2=3 a3=0 items=0 ppid=1 pid=5856 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:39.037000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:39.038709 sshd-session[5856]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:29:39.043237 systemd-logind[1681]: New session 18 of user core. Jan 15 00:29:39.054594 systemd[1]: Started session-18.scope - Session 18 of User core. 
Jan 15 00:29:39.056000 audit[5856]: USER_START pid=5856 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:39.057000 audit[5861]: CRED_ACQ pid=5861 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:39.444211 sshd[5861]: Connection closed by 20.161.92.111 port 36516 Jan 15 00:29:39.445243 sshd-session[5856]: pam_unix(sshd:session): session closed for user core Jan 15 00:29:39.445000 audit[5856]: USER_END pid=5856 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:39.445000 audit[5856]: CRED_DISP pid=5856 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:39.450278 systemd[1]: sshd@40-10.0.3.29:22-20.161.92.111:36516.service: Deactivated successfully. Jan 15 00:29:39.450000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-10.0.3.29:22-20.161.92.111:36516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:39.453705 systemd[1]: session-18.scope: Deactivated successfully. Jan 15 00:29:39.456606 systemd-logind[1681]: Session 18 logged out. Waiting for processes to exit. Jan 15 00:29:39.457524 systemd-logind[1681]: Removed session 18. Jan 15 00:29:39.554000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-10.0.3.29:22-20.161.92.111:36528 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:39.555832 systemd[1]: Started sshd@41-10.0.3.29:22-20.161.92.111:36528.service - OpenSSH per-connection server daemon (20.161.92.111:36528). 
Jan 15 00:29:39.873533 kubelet[2931]: E0115 00:29:39.873454 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:29:40.083000 audit[5872]: USER_ACCT pid=5872 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:40.084799 sshd[5872]: Accepted publickey for core from 20.161.92.111 port 36528 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:29:40.084000 audit[5872]: CRED_ACQ pid=5872 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:40.084000 audit[5872]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff0098960 a2=3 a3=0 items=0 ppid=1 pid=5872 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:40.084000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:40.086403 sshd-session[5872]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:29:40.091121 systemd-logind[1681]: New session 19 of user core. Jan 15 00:29:40.101383 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 15 00:29:40.102000 audit[5872]: USER_START pid=5872 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:40.105000 audit[5875]: CRED_ACQ pid=5875 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:40.873297 kubelet[2931]: E0115 00:29:40.873090 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:29:40.903000 audit[5886]: NETFILTER_CFG table=filter:147 family=2 entries=26 op=nft_register_rule pid=5886 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:29:40.903000 audit[5886]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffdb57c1d0 a2=0 a3=1 items=0 ppid=3036 pid=5886 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:40.903000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:29:40.918000 audit[5886]: NETFILTER_CFG table=nat:148 family=2 entries=20 op=nft_register_rule pid=5886 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:29:40.918000 audit[5886]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffdb57c1d0 a2=0 a3=1 items=0 ppid=3036 pid=5886 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:40.918000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:29:40.934000 audit[5888]: NETFILTER_CFG table=filter:149 family=2 entries=38 op=nft_register_rule pid=5888 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:29:40.934000 audit[5888]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffc7152980 a2=0 a3=1 items=0 ppid=3036 pid=5888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:40.934000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:29:40.944000 audit[5888]: NETFILTER_CFG table=nat:150 family=2 entries=20 op=nft_register_rule pid=5888 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:29:40.944000 audit[5888]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffc7152980 
a2=0 a3=1 items=0 ppid=3036 pid=5888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:40.944000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:29:40.995161 sshd[5875]: Connection closed by 20.161.92.111 port 36528 Jan 15 00:29:40.995897 sshd-session[5872]: pam_unix(sshd:session): session closed for user core Jan 15 00:29:40.995000 audit[5872]: USER_END pid=5872 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:40.996000 audit[5872]: CRED_DISP pid=5872 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:40.999951 systemd[1]: sshd@41-10.0.3.29:22-20.161.92.111:36528.service: Deactivated successfully. Jan 15 00:29:40.999000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-10.0.3.29:22-20.161.92.111:36528 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:41.001845 systemd[1]: session-19.scope: Deactivated successfully. Jan 15 00:29:41.002687 systemd-logind[1681]: Session 19 logged out. Waiting for processes to exit. Jan 15 00:29:41.003780 systemd-logind[1681]: Removed session 19. Jan 15 00:29:41.100000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-10.0.3.29:22-20.161.92.111:36544 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:41.101710 systemd[1]: Started sshd@42-10.0.3.29:22-20.161.92.111:36544.service - OpenSSH per-connection server daemon (20.161.92.111:36544). 
Jan 15 00:29:41.628000 audit[5893]: USER_ACCT pid=5893 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:41.630428 sshd[5893]: Accepted publickey for core from 20.161.92.111 port 36544 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:29:41.631000 audit[5893]: CRED_ACQ pid=5893 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:41.633534 sshd-session[5893]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:29:41.631000 audit[5893]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcac34c30 a2=3 a3=0 items=0 ppid=1 pid=5893 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:41.631000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:41.640883 systemd-logind[1681]: New session 20 of user core. Jan 15 00:29:41.647603 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 15 00:29:41.652000 audit[5893]: USER_START pid=5893 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:41.654000 audit[5896]: CRED_ACQ pid=5896 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:41.875421 kubelet[2931]: E0115 00:29:41.875328 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:29:42.125786 sshd[5896]: Connection closed by 20.161.92.111 port 36544 Jan 15 00:29:42.126117 sshd-session[5893]: pam_unix(sshd:session): session closed for user core Jan 15 00:29:42.126000 audit[5893]: USER_END pid=5893 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:42.126000 audit[5893]: CRED_DISP pid=5893 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:42.130538 systemd-logind[1681]: Session 20 logged out. Waiting for processes to exit. Jan 15 00:29:42.130767 systemd[1]: sshd@42-10.0.3.29:22-20.161.92.111:36544.service: Deactivated successfully. Jan 15 00:29:42.131000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-10.0.3.29:22-20.161.92.111:36544 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:42.134119 systemd[1]: session-20.scope: Deactivated successfully. Jan 15 00:29:42.137381 systemd-logind[1681]: Removed session 20. Jan 15 00:29:42.235000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-10.0.3.29:22-20.161.92.111:36556 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:42.236886 systemd[1]: Started sshd@43-10.0.3.29:22-20.161.92.111:36556.service - OpenSSH per-connection server daemon (20.161.92.111:36556). Jan 15 00:29:42.768000 audit[5908]: USER_ACCT pid=5908 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:42.769803 sshd[5908]: Accepted publickey for core from 20.161.92.111 port 36556 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:29:42.770839 kernel: kauditd_printk_skb: 47 callbacks suppressed Jan 15 00:29:42.770922 kernel: audit: type=1101 audit(1768436982.768:929): pid=5908 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:42.773000 audit[5908]: CRED_ACQ pid=5908 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:42.775278 sshd-session[5908]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:29:42.778155 kernel: audit: type=1103 audit(1768436982.773:930): pid=5908 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:42.778302 kernel: audit: type=1006 audit(1768436982.773:931): pid=5908 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Jan 15 00:29:42.773000 audit[5908]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc657d7c0 a2=3 a3=0 items=0 ppid=1 pid=5908 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 15 00:29:42.783851 kernel: audit: type=1300 audit(1768436982.773:931): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc657d7c0 a2=3 a3=0 items=0 ppid=1 pid=5908 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:42.773000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:42.785513 kernel: audit: type=1327 audit(1768436982.773:931): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:42.787773 systemd-logind[1681]: New session 21 of user core. Jan 15 00:29:42.798397 systemd[1]: Started session-21.scope - Session 21 of User core. Jan 15 00:29:42.800000 audit[5908]: USER_START pid=5908 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:42.806199 kernel: audit: type=1105 audit(1768436982.800:932): pid=5908 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:42.805000 audit[5911]: CRED_ACQ pid=5911 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:42.810216 kernel: audit: type=1103 audit(1768436982.805:933): pid=5911 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:43.131558 sshd[5911]: Connection closed by 20.161.92.111 port 36556 Jan 15 00:29:43.131479 sshd-session[5908]: pam_unix(sshd:session): session closed for user core Jan 15 00:29:43.132000 audit[5908]: USER_END pid=5908 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:43.136704 systemd[1]: sshd@43-10.0.3.29:22-20.161.92.111:36556.service: Deactivated successfully. Jan 15 00:29:43.132000 audit[5908]: CRED_DISP pid=5908 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:43.138600 systemd[1]: session-21.scope: Deactivated successfully. Jan 15 00:29:43.139357 systemd-logind[1681]: Session 21 logged out. Waiting for processes to exit. Jan 15 00:29:43.140832 systemd-logind[1681]: Removed session 21. 
Jan 15 00:29:43.141052 kernel: audit: type=1106 audit(1768436983.132:934): pid=5908 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:43.141109 kernel: audit: type=1104 audit(1768436983.132:935): pid=5908 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:43.135000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-10.0.3.29:22-20.161.92.111:36556 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:43.143901 kernel: audit: type=1131 audit(1768436983.135:936): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-10.0.3.29:22-20.161.92.111:36556 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:43.872931 kubelet[2931]: E0115 00:29:43.872888 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:29:45.065000 audit[5924]: NETFILTER_CFG table=filter:151 family=2 entries=26 op=nft_register_rule pid=5924 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:29:45.065000 audit[5924]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc5f57240 a2=0 a3=1 items=0 ppid=3036 pid=5924 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:45.065000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:29:45.072000 audit[5924]: NETFILTER_CFG table=nat:152 family=2 entries=104 op=nft_register_chain pid=5924 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 00:29:45.072000 audit[5924]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=48684 a0=3 a1=ffffc5f57240 a2=0 a3=1 items=0 ppid=3036 pid=5924 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:45.072000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 00:29:46.876249 kubelet[2931]: E0115 00:29:46.875970 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:29:48.243050 systemd[1]: Started sshd@44-10.0.3.29:22-20.161.92.111:54850.service - OpenSSH per-connection server daemon (20.161.92.111:54850). Jan 15 00:29:48.242000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-10.0.3.29:22-20.161.92.111:54850 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:48.246956 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 15 00:29:48.247030 kernel: audit: type=1130 audit(1768436988.242:939): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-10.0.3.29:22-20.161.92.111:54850 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:48.768000 audit[5928]: USER_ACCT pid=5928 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:48.769465 sshd[5928]: Accepted publickey for core from 20.161.92.111 port 54850 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:29:48.771000 audit[5928]: CRED_ACQ pid=5928 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:48.773412 sshd-session[5928]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:29:48.776182 kernel: audit: type=1101 audit(1768436988.768:940): pid=5928 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:48.776259 kernel: audit: type=1103 audit(1768436988.771:941): pid=5928 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:48.776291 kernel: audit: type=1006 audit(1768436988.771:942): pid=5928 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Jan 15 00:29:48.771000 audit[5928]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdc919c10 a2=3 a3=0 items=0 ppid=1 pid=5928 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:48.779641 systemd-logind[1681]: New session 22 of user core. 
Jan 15 00:29:48.781877 kernel: audit: type=1300 audit(1768436988.771:942): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdc919c10 a2=3 a3=0 items=0 ppid=1 pid=5928 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:48.771000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:48.784396 kernel: audit: type=1327 audit(1768436988.771:942): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:48.788400 systemd[1]: Started session-22.scope - Session 22 of User core. Jan 15 00:29:48.790000 audit[5928]: USER_START pid=5928 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:48.795193 kernel: audit: type=1105 audit(1768436988.790:943): pid=5928 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:48.795264 kernel: audit: type=1103 audit(1768436988.791:944): pid=5931 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:48.791000 audit[5931]: CRED_ACQ pid=5931 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:49.130117 sshd[5931]: Connection closed by 20.161.92.111 port 54850 Jan 15 00:29:49.130455 sshd-session[5928]: pam_unix(sshd:session): session closed for user core Jan 15 00:29:49.130000 audit[5928]: USER_END pid=5928 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:49.134923 systemd[1]: sshd@44-10.0.3.29:22-20.161.92.111:54850.service: Deactivated successfully. Jan 15 00:29:49.130000 audit[5928]: CRED_DISP pid=5928 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:49.137570 systemd[1]: session-22.scope: Deactivated successfully. 
Jan 15 00:29:49.138860 kernel: audit: type=1106 audit(1768436989.130:945): pid=5928 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:49.138927 kernel: audit: type=1104 audit(1768436989.130:946): pid=5928 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:49.134000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-10.0.3.29:22-20.161.92.111:54850 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:49.139647 systemd-logind[1681]: Session 22 logged out. Waiting for processes to exit. Jan 15 00:29:49.141319 systemd-logind[1681]: Removed session 22. Jan 15 00:29:51.873408 kubelet[2931]: E0115 00:29:51.873348 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:29:53.872572 kubelet[2931]: E0115 00:29:53.872525 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:29:54.242000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-10.0.3.29:22-20.161.92.111:48516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:54.243438 systemd[1]: Started sshd@45-10.0.3.29:22-20.161.92.111:48516.service - OpenSSH per-connection server daemon (20.161.92.111:48516). Jan 15 00:29:54.244447 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 15 00:29:54.244497 kernel: audit: type=1130 audit(1768436994.242:948): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-10.0.3.29:22-20.161.92.111:48516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:29:54.771000 sshd[5964]: Accepted publickey for core from 20.161.92.111 port 48516 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:29:54.769000 audit[5964]: USER_ACCT pid=5964 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:54.773000 audit[5964]: CRED_ACQ pid=5964 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:54.775546 sshd-session[5964]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:29:54.780199 kernel: audit: type=1101 audit(1768436994.769:949): pid=5964 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:54.780279 kernel: audit: type=1103 audit(1768436994.773:950): pid=5964 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:54.780309 kernel: audit: type=1006 audit(1768436994.774:951): pid=5964 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Jan 15 00:29:54.774000 audit[5964]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc1aae900 a2=3 a3=0 items=0 ppid=1 pid=5964 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:54.783759 kernel: audit: type=1300 audit(1768436994.774:951): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc1aae900 a2=3 a3=0 items=0 ppid=1 pid=5964 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:29:54.774000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:54.785123 kernel: audit: type=1327 audit(1768436994.774:951): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:29:54.789086 systemd-logind[1681]: New session 23 of user core. Jan 15 00:29:54.799638 systemd[1]: Started session-23.scope - Session 23 of User core. 
Jan 15 00:29:54.800000 audit[5964]: USER_START pid=5964 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:54.804000 audit[5967]: CRED_ACQ pid=5967 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:54.809135 kernel: audit: type=1105 audit(1768436994.800:952): pid=5964 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:54.809212 kernel: audit: type=1103 audit(1768436994.804:953): pid=5967 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:54.873674 kubelet[2931]: E0115 00:29:54.873602 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:29:55.132713 sshd[5967]: Connection closed by 20.161.92.111 port 48516 Jan 15 00:29:55.132580 sshd-session[5964]: pam_unix(sshd:session): session closed for user core Jan 15 00:29:55.133000 audit[5964]: USER_END pid=5964 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:55.138440 systemd[1]: sshd@45-10.0.3.29:22-20.161.92.111:48516.service: Deactivated successfully. 
Jan 15 00:29:55.142124 kernel: audit: type=1106 audit(1768436995.133:954): pid=5964 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:55.142215 kernel: audit: type=1104 audit(1768436995.133:955): pid=5964 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:55.133000 audit[5964]: CRED_DISP pid=5964 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:29:55.140245 systemd[1]: session-23.scope: Deactivated successfully. Jan 15 00:29:55.137000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-10.0.3.29:22-20.161.92.111:48516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:29:55.143049 systemd-logind[1681]: Session 23 logged out. Waiting for processes to exit. Jan 15 00:29:55.144006 systemd-logind[1681]: Removed session 23. Jan 15 00:29:55.873671 kubelet[2931]: E0115 00:29:55.873620 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:29:56.874989 kubelet[2931]: E0115 00:29:56.874696 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:29:58.873772 kubelet[2931]: E0115 00:29:58.873705 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:30:00.239881 systemd[1]: Started sshd@46-10.0.3.29:22-20.161.92.111:48530.service - OpenSSH per-connection server daemon (20.161.92.111:48530). Jan 15 00:30:00.243791 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 15 00:30:00.243820 kernel: audit: type=1130 audit(1768437000.238:957): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-10.0.3.29:22-20.161.92.111:48530 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:30:00.238000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-10.0.3.29:22-20.161.92.111:48530 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:30:00.754000 audit[5980]: USER_ACCT pid=5980 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:00.756038 sshd[5980]: Accepted publickey for core from 20.161.92.111 port 48530 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:30:00.758000 audit[5980]: CRED_ACQ pid=5980 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:00.760639 sshd-session[5980]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:30:00.765062 kernel: audit: type=1101 audit(1768437000.754:958): pid=5980 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:00.765156 kernel: audit: type=1103 audit(1768437000.758:959): pid=5980 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:00.765217 kernel: audit: type=1006 audit(1768437000.758:960): pid=5980 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Jan 15 00:30:00.758000 audit[5980]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe7436aa0 a2=3 a3=0 items=0 ppid=1 pid=5980 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:00.770552 kernel: audit: type=1300 audit(1768437000.758:960): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe7436aa0 a2=3 a3=0 items=0 ppid=1 pid=5980 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:00.758000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:30:00.772243 kernel: audit: type=1327 audit(1768437000.758:960): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:30:00.774510 systemd-logind[1681]: New session 24 of user core. Jan 15 00:30:00.782519 systemd[1]: Started session-24.scope - Session 24 of User core. Jan 15 00:30:00.784000 audit[5980]: USER_START pid=5980 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:00.789000 audit[5983]: CRED_ACQ pid=5983 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:00.793444 kernel: audit: type=1105 audit(1768437000.784:961): pid=5980 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:00.793557 kernel: audit: type=1103 audit(1768437000.789:962): pid=5983 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:01.110577 sshd[5983]: Connection closed by 20.161.92.111 port 48530 Jan 15 00:30:01.111030 sshd-session[5980]: pam_unix(sshd:session): session closed for user core Jan 15 00:30:01.111000 audit[5980]: USER_END pid=5980 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:01.114955 systemd[1]: sshd@46-10.0.3.29:22-20.161.92.111:48530.service: Deactivated successfully. Jan 15 00:30:01.116741 systemd[1]: session-24.scope: Deactivated successfully. Jan 15 00:30:01.118267 systemd-logind[1681]: Session 24 logged out. Waiting for processes to exit. 
Jan 15 00:30:01.111000 audit[5980]: CRED_DISP pid=5980 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:01.121951 kernel: audit: type=1106 audit(1768437001.111:963): pid=5980 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:01.122079 kernel: audit: type=1104 audit(1768437001.111:964): pid=5980 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:01.111000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-10.0.3.29:22-20.161.92.111:48530 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:30:01.123221 systemd-logind[1681]: Removed session 24. Jan 15 00:30:05.873665 kubelet[2931]: E0115 00:30:05.873619 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:30:06.218070 systemd[1]: Started sshd@47-10.0.3.29:22-20.161.92.111:50666.service - OpenSSH per-connection server daemon (20.161.92.111:50666). Jan 15 00:30:06.217000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-10.0.3.29:22-20.161.92.111:50666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:30:06.219359 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 15 00:30:06.219449 kernel: audit: type=1130 audit(1768437006.217:966): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-10.0.3.29:22-20.161.92.111:50666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:30:06.748000 audit[5996]: USER_ACCT pid=5996 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:06.749869 sshd[5996]: Accepted publickey for core from 20.161.92.111 port 50666 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:30:06.753228 kernel: audit: type=1101 audit(1768437006.748:967): pid=5996 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:06.752000 audit[5996]: CRED_ACQ pid=5996 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:06.754214 sshd-session[5996]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:30:06.758938 kernel: audit: type=1103 audit(1768437006.752:968): pid=5996 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:06.759112 kernel: audit: type=1006 audit(1768437006.752:969): pid=5996 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Jan 15 00:30:06.759229 kernel: audit: type=1300 audit(1768437006.752:969): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff5bf2100 a2=3 a3=0 items=0 ppid=1 pid=5996 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:06.752000 audit[5996]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff5bf2100 a2=3 a3=0 items=0 ppid=1 pid=5996 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:06.752000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:30:06.763914 kernel: audit: type=1327 audit(1768437006.752:969): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:30:06.767259 systemd-logind[1681]: New session 25 of user core. Jan 15 00:30:06.773374 systemd[1]: Started session-25.scope - Session 25 of User core. 
Jan 15 00:30:06.775000 audit[5996]: USER_START pid=5996 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:06.779000 audit[5999]: CRED_ACQ pid=5999 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:06.783856 kernel: audit: type=1105 audit(1768437006.775:970): pid=5996 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:06.783988 kernel: audit: type=1103 audit(1768437006.779:971): pid=5999 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:06.872945 kubelet[2931]: E0115 00:30:06.872804 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:30:06.872945 kubelet[2931]: E0115 00:30:06.872864 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:30:07.110237 sshd[5999]: Connection closed by 20.161.92.111 port 50666 Jan 15 00:30:07.110968 sshd-session[5996]: pam_unix(sshd:session): session closed for user core Jan 15 00:30:07.112000 audit[5996]: USER_END pid=5996 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:07.117405 systemd[1]: sshd@47-10.0.3.29:22-20.161.92.111:50666.service: Deactivated successfully. Jan 15 00:30:07.112000 audit[5996]: CRED_DISP pid=5996 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:07.120651 systemd[1]: session-25.scope: Deactivated successfully. 
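The recurring kubelet pod_workers records above and below all share one root cause: every Calico image referenced at tag v3.30.4 (goldmane, csi, node-driver-registrar, whisker, whisker-backend, apiserver, kube-controllers) resolves against ghcr.io/flatcar/calico, the registry reports NotFound, so the pulls fail with ErrImagePull and the pods sit in ImagePullBackOff while the same error is re-logged on every back-off retry. A small sketch that condenses a captured journal into one line per affected pod; it assumes one record per line, and the regexes follow the textual form shown here (the exact quote escaping may differ depending on how the journal was exported).

```python
import re
import sys
from collections import defaultdict

# kubelet records look like:
#   kubelet[2931]: E0115 ... "Error syncing pod, skipping" err="... Back-off pulling image
#   \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" ..." pod="calico-system/goldmane-..." podUID="..."
IMAGE_RE = re.compile(r'Back-off pulling image \\+"([^\\"]+)\\+"')
POD_RE = re.compile(r'pod="([^"]+)"')

def failing_images(journal):
    """Map pod -> set of image references stuck in ImagePullBackOff."""
    result = defaultdict(set)
    for line in journal:
        if "Error syncing pod" not in line:
            continue
        pod = POD_RE.search(line)
        if not pod:
            continue
        for image in IMAGE_RE.findall(line):
            result[pod.group(1)].add(image)
    return result

if __name__ == "__main__":
    for pod, images in sorted(failing_images(sys.stdin).items()):
        print(f"{pod}: {', '.join(sorted(images))}")
```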
Jan 15 00:30:07.122832 systemd-logind[1681]: Session 25 logged out. Waiting for processes to exit. Jan 15 00:30:07.123347 kernel: audit: type=1106 audit(1768437007.112:972): pid=5996 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:07.123386 kernel: audit: type=1104 audit(1768437007.112:973): pid=5996 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:07.124011 systemd-logind[1681]: Removed session 25. Jan 15 00:30:07.116000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-10.0.3.29:22-20.161.92.111:50666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:30:07.779280 systemd[1]: Started sshd@48-10.0.3.29:22-146.190.22.154:59866.service - OpenSSH per-connection server daemon (146.190.22.154:59866). Jan 15 00:30:07.778000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-10.0.3.29:22-146.190.22.154:59866 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:30:08.536928 sshd[6014]: Connection closed by authenticating user root 146.190.22.154 port 59866 [preauth] Jan 15 00:30:08.536000 audit[6014]: USER_ERR pid=6014 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=146.190.22.154 addr=146.190.22.154 terminal=ssh res=failed' Jan 15 00:30:08.539652 systemd[1]: sshd@48-10.0.3.29:22-146.190.22.154:59866.service: Deactivated successfully. Jan 15 00:30:08.538000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-10.0.3.29:22-146.190.22.154:59866 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:30:09.873054 kubelet[2931]: E0115 00:30:09.872991 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:30:09.873468 kubelet[2931]: E0115 00:30:09.873369 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:30:12.219465 systemd[1]: Started sshd@49-10.0.3.29:22-20.161.92.111:50682.service - OpenSSH per-connection server daemon (20.161.92.111:50682). Jan 15 00:30:12.221199 kernel: kauditd_printk_skb: 4 callbacks suppressed Jan 15 00:30:12.221277 kernel: audit: type=1130 audit(1768437012.218:978): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-10.0.3.29:22-20.161.92.111:50682 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:30:12.218000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-10.0.3.29:22-20.161.92.111:50682 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:30:12.743000 audit[6046]: USER_ACCT pid=6046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:12.745142 sshd[6046]: Accepted publickey for core from 20.161.92.111 port 50682 ssh2: RSA SHA256:c7QV5+n9HxSDqJGaJpv/vl1ZFYfHOU2iqPn8SY1yls8 Jan 15 00:30:12.749229 kernel: audit: type=1101 audit(1768437012.743:979): pid=6046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:12.748000 audit[6046]: CRED_ACQ pid=6046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:12.749835 sshd-session[6046]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 00:30:12.754914 kernel: audit: type=1103 audit(1768437012.748:980): pid=6046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:12.755040 kernel: audit: type=1006 audit(1768437012.748:981): pid=6046 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Jan 15 00:30:12.748000 audit[6046]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd60ceee0 a2=3 a3=0 items=0 ppid=1 pid=6046 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:12.759520 kernel: audit: type=1300 audit(1768437012.748:981): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd60ceee0 a2=3 a3=0 items=0 ppid=1 pid=6046 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:12.759581 kernel: audit: type=1327 audit(1768437012.748:981): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:30:12.748000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 00:30:12.759224 systemd-logind[1681]: New session 26 of user core. Jan 15 00:30:12.766518 systemd[1]: Started session-26.scope - Session 26 of User core. 
Jan 15 00:30:12.768000 audit[6046]: USER_START pid=6046 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:12.773187 kernel: audit: type=1105 audit(1768437012.768:982): pid=6046 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:12.772000 audit[6049]: CRED_ACQ pid=6049 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:12.777215 kernel: audit: type=1103 audit(1768437012.772:983): pid=6049 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:12.874225 kubelet[2931]: E0115 00:30:12.874084 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:30:13.096248 sshd[6049]: Connection closed by 20.161.92.111 port 50682 Jan 15 00:30:13.097230 sshd-session[6046]: pam_unix(sshd:session): session closed for user core Jan 15 00:30:13.097000 audit[6046]: USER_END pid=6046 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:13.101329 systemd-logind[1681]: Session 26 logged out. Waiting for processes to exit. Jan 15 00:30:13.101525 systemd[1]: sshd@49-10.0.3.29:22-20.161.92.111:50682.service: Deactivated successfully. 
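Because the failure is NotFound at image resolution time rather than a transient pull error, no amount of back-off will clear it until the v3.30.4 tag exists upstream (or the pod specs reference a tag that does). A rough way to confirm that from any machine is to HEAD the manifest on ghcr.io. This sketch assumes GHCR's standard Docker Registry v2 anonymous token flow for public images; the endpoint, query parameter and Accept headers are assumptions about the registry, not anything taken from this log.

```python
import json
import urllib.error
import urllib.request

def ghcr_tag_exists(repo: str, tag: str) -> bool:
    """HEAD the manifest for repo:tag on ghcr.io; True if the tag resolves."""
    # Anonymous bearer token for a public repository (Docker Registry v2 auth flow).
    token = json.load(urllib.request.urlopen(
        f"https://ghcr.io/token?scope=repository:{repo}:pull"))["token"]
    req = urllib.request.Request(
        f"https://ghcr.io/v2/{repo}/manifests/{tag}",
        headers={
            "Authorization": f"Bearer {token}",
            # Accept both OCI and Docker manifest list media types.
            "Accept": "application/vnd.oci.image.index.v1+json, "
                      "application/vnd.docker.distribution.manifest.list.v2+json",
        },
        method="HEAD",
    )
    try:
        urllib.request.urlopen(req)
        return True
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False
        raise

if __name__ == "__main__":
    print(ghcr_tag_exists("flatcar/calico/goldmane", "v3.30.4"))
```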
Jan 15 00:30:13.097000 audit[6046]: CRED_DISP pid=6046 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:13.105603 kernel: audit: type=1106 audit(1768437013.097:984): pid=6046 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:13.105689 kernel: audit: type=1104 audit(1768437013.097:985): pid=6046 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=20.161.92.111 addr=20.161.92.111 terminal=ssh res=success' Jan 15 00:30:13.101000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-10.0.3.29:22-20.161.92.111:50682 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:30:13.106098 systemd[1]: session-26.scope: Deactivated successfully. Jan 15 00:30:13.108413 systemd-logind[1681]: Removed session 26. Jan 15 00:30:18.874481 kubelet[2931]: E0115 00:30:18.874437 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:30:20.872633 kubelet[2931]: E0115 00:30:20.872526 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:30:21.873119 kubelet[2931]: E0115 00:30:21.873069 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:30:21.873624 
kubelet[2931]: E0115 00:30:21.873211 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:30:22.874772 kubelet[2931]: E0115 00:30:22.874733 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:30:23.873119 kubelet[2931]: E0115 00:30:23.873071 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:30:29.873549 kubelet[2931]: E0115 00:30:29.873491 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:30:32.874187 kubelet[2931]: E0115 00:30:32.873960 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" 
pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:30:34.873634 kubelet[2931]: E0115 00:30:34.873586 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:30:34.874604 kubelet[2931]: E0115 00:30:34.874566 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:30:35.873539 kubelet[2931]: E0115 00:30:35.873498 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:30:36.875476 kubelet[2931]: E0115 00:30:36.875433 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f" Jan 15 00:30:42.874519 kubelet[2931]: E0115 00:30:42.874279 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-nn9k9" podUID="933e7fe5-e25e-48cf-938a-716b1fa3d838" Jan 15 00:30:43.208550 systemd[1]: cri-containerd-74ed9079d102c426017b480e8e91698b5364387037d9ab8e0f02d77389836195.scope: Deactivated successfully. 
Jan 15 00:30:43.209448 systemd[1]: cri-containerd-74ed9079d102c426017b480e8e91698b5364387037d9ab8e0f02d77389836195.scope: Consumed 1min 16.901s CPU time, 117.2M memory peak. Jan 15 00:30:43.210971 containerd[1703]: time="2026-01-15T00:30:43.210585675Z" level=info msg="received container exit event container_id:\"74ed9079d102c426017b480e8e91698b5364387037d9ab8e0f02d77389836195\" id:\"74ed9079d102c426017b480e8e91698b5364387037d9ab8e0f02d77389836195\" pid:3251 exit_status:1 exited_at:{seconds:1768437043 nanos:210118793}" Jan 15 00:30:43.213000 audit: BPF prog-id=146 op=UNLOAD Jan 15 00:30:43.215284 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 15 00:30:43.215392 kernel: audit: type=1334 audit(1768437043.213:987): prog-id=146 op=UNLOAD Jan 15 00:30:43.213000 audit: BPF prog-id=150 op=UNLOAD Jan 15 00:30:43.217244 kernel: audit: type=1334 audit(1768437043.213:988): prog-id=150 op=UNLOAD Jan 15 00:30:43.238019 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-74ed9079d102c426017b480e8e91698b5364387037d9ab8e0f02d77389836195-rootfs.mount: Deactivated successfully. Jan 15 00:30:43.576760 systemd[1]: cri-containerd-652a3b325d5dfe852f4a64d18402e1991306d05b0ed0ee5ba8cbf183c48ea90e.scope: Deactivated successfully. Jan 15 00:30:43.577111 systemd[1]: cri-containerd-652a3b325d5dfe852f4a64d18402e1991306d05b0ed0ee5ba8cbf183c48ea90e.scope: Consumed 5.991s CPU time, 64.7M memory peak. Jan 15 00:30:43.576000 audit: BPF prog-id=256 op=LOAD Jan 15 00:30:43.578899 containerd[1703]: time="2026-01-15T00:30:43.578704804Z" level=info msg="received container exit event container_id:\"652a3b325d5dfe852f4a64d18402e1991306d05b0ed0ee5ba8cbf183c48ea90e\" id:\"652a3b325d5dfe852f4a64d18402e1991306d05b0ed0ee5ba8cbf183c48ea90e\" pid:2776 exit_status:1 exited_at:{seconds:1768437043 nanos:578380803}" Jan 15 00:30:43.576000 audit: BPF prog-id=93 op=UNLOAD Jan 15 00:30:43.579984 kernel: audit: type=1334 audit(1768437043.576:989): prog-id=256 op=LOAD Jan 15 00:30:43.580146 kernel: audit: type=1334 audit(1768437043.576:990): prog-id=93 op=UNLOAD Jan 15 00:30:43.581000 audit: BPF prog-id=103 op=UNLOAD Jan 15 00:30:43.581000 audit: BPF prog-id=107 op=UNLOAD Jan 15 00:30:43.584328 kernel: audit: type=1334 audit(1768437043.581:991): prog-id=103 op=UNLOAD Jan 15 00:30:43.585253 kernel: audit: type=1334 audit(1768437043.581:992): prog-id=107 op=UNLOAD Jan 15 00:30:43.605548 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-652a3b325d5dfe852f4a64d18402e1991306d05b0ed0ee5ba8cbf183c48ea90e-rootfs.mount: Deactivated successfully. 
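At 00:30:43 two long-running containers exit with status 1: 74ed9079… (which had consumed about 1 min 17 s of CPU) and 652a3b32…. systemd tears down their cri-containerd scopes and rootfs mounts, and in the records that follow kubelet notices, logs RemoveContainer for the dead IDs, and asks containerd to create Attempt:1 replacements for kube-controller-manager and tigera-operator in the same sandboxes. A small illustrative watcher for this pattern, keyed on containerd's "received container exit event" records in the form shown above; containerd's text output appears to omit exit_status when it is zero, so only non-zero exits are expected to match.

```python
import re
import sys

# containerd logs exits as, e.g.:
#   msg="received container exit event container_id:\"74ed9079...\" id:\"...\" pid:3251
#        exit_status:1 exited_at:{seconds:1768437043 nanos:210118793}"
EXIT_RE = re.compile(
    r'received container exit event container_id:\\?"(?P<cid>[0-9a-f]+)\\?"'
    r'.*?exit_status:(?P<status>\d+)'
    r'.*?exited_at:\{seconds:(?P<secs>\d+)'
)

def nonzero_exits(journal):
    """Yield (container_id, exit_status, unix_seconds) for non-zero container exits."""
    for line in journal:
        m = EXIT_RE.search(line)
        if m:
            yield m["cid"], int(m["status"]), int(m["secs"])

if __name__ == "__main__":
    for cid, status, ts in nonzero_exits(sys.stdin):
        print(f"{cid[:12]} exited with status {status} at t={ts}")
```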
Jan 15 00:30:43.667418 kubelet[2931]: E0115 00:30:43.667380 2931 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.3.29:37114->10.0.3.61:2379: read: connection timed out" Jan 15 00:30:44.005710 kubelet[2931]: I0115 00:30:44.005489 2931 scope.go:117] "RemoveContainer" containerID="652a3b325d5dfe852f4a64d18402e1991306d05b0ed0ee5ba8cbf183c48ea90e" Jan 15 00:30:44.007726 kubelet[2931]: I0115 00:30:44.007479 2931 scope.go:117] "RemoveContainer" containerID="74ed9079d102c426017b480e8e91698b5364387037d9ab8e0f02d77389836195" Jan 15 00:30:44.008045 containerd[1703]: time="2026-01-15T00:30:44.008007280Z" level=info msg="CreateContainer within sandbox \"08183a919e7ba618cae7f89a12b45b32c2bb86918246d01e17c63a93db3c9876\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 15 00:30:44.020651 containerd[1703]: time="2026-01-15T00:30:44.020570559Z" level=info msg="CreateContainer within sandbox \"6db9f1f6a04d77876999e3dd848de1fd68714e4772dde678ded76bed332fd677\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 15 00:30:44.023987 containerd[1703]: time="2026-01-15T00:30:44.023937569Z" level=info msg="Container c4817e952955a34b8da15a5f40cb4808174b7c09d24b9bdff33eff3420154540: CDI devices from CRI Config.CDIDevices: []" Jan 15 00:30:44.033980 containerd[1703]: time="2026-01-15T00:30:44.033884319Z" level=info msg="CreateContainer within sandbox \"08183a919e7ba618cae7f89a12b45b32c2bb86918246d01e17c63a93db3c9876\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"c4817e952955a34b8da15a5f40cb4808174b7c09d24b9bdff33eff3420154540\"" Jan 15 00:30:44.034920 containerd[1703]: time="2026-01-15T00:30:44.034285961Z" level=info msg="Container bb86b780b4bde1d5cd8151a0b565f610c602f43d38d465944eae20de4d1d032b: CDI devices from CRI Config.CDIDevices: []" Jan 15 00:30:44.035656 containerd[1703]: time="2026-01-15T00:30:44.035619005Z" level=info msg="StartContainer for \"c4817e952955a34b8da15a5f40cb4808174b7c09d24b9bdff33eff3420154540\"" Jan 15 00:30:44.037368 containerd[1703]: time="2026-01-15T00:30:44.037340370Z" level=info msg="connecting to shim c4817e952955a34b8da15a5f40cb4808174b7c09d24b9bdff33eff3420154540" address="unix:///run/containerd/s/f69039bc65a3950919086b0f712abb117d343e353eded86d44169ead0c9e2bf2" protocol=ttrpc version=3 Jan 15 00:30:44.041972 containerd[1703]: time="2026-01-15T00:30:44.041920744Z" level=info msg="CreateContainer within sandbox \"6db9f1f6a04d77876999e3dd848de1fd68714e4772dde678ded76bed332fd677\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"bb86b780b4bde1d5cd8151a0b565f610c602f43d38d465944eae20de4d1d032b\"" Jan 15 00:30:44.042401 containerd[1703]: time="2026-01-15T00:30:44.042378826Z" level=info msg="StartContainer for \"bb86b780b4bde1d5cd8151a0b565f610c602f43d38d465944eae20de4d1d032b\"" Jan 15 00:30:44.043301 containerd[1703]: time="2026-01-15T00:30:44.043274148Z" level=info msg="connecting to shim bb86b780b4bde1d5cd8151a0b565f610c602f43d38d465944eae20de4d1d032b" address="unix:///run/containerd/s/b1a6abcd76c5109a1fefe0941d607c8a89b208f1c06a4a9b62e64e3e0f4b6700" protocol=ttrpc version=3 Jan 15 00:30:44.056378 systemd[1]: Started cri-containerd-c4817e952955a34b8da15a5f40cb4808174b7c09d24b9bdff33eff3420154540.scope - libcontainer container c4817e952955a34b8da15a5f40cb4808174b7c09d24b9bdff33eff3420154540. 
Jan 15 00:30:44.059705 systemd[1]: Started cri-containerd-bb86b780b4bde1d5cd8151a0b565f610c602f43d38d465944eae20de4d1d032b.scope - libcontainer container bb86b780b4bde1d5cd8151a0b565f610c602f43d38d465944eae20de4d1d032b. Jan 15 00:30:44.068000 audit: BPF prog-id=257 op=LOAD Jan 15 00:30:44.069000 audit: BPF prog-id=258 op=LOAD Jan 15 00:30:44.072240 kernel: audit: type=1334 audit(1768437044.068:993): prog-id=257 op=LOAD Jan 15 00:30:44.072305 kernel: audit: type=1334 audit(1768437044.069:994): prog-id=258 op=LOAD Jan 15 00:30:44.072331 kernel: audit: type=1300 audit(1768437044.069:994): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=2628 pid=6119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:44.069000 audit[6119]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=2628 pid=6119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:44.069000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334383137653935323935356133346238646131356135663430636234 Jan 15 00:30:44.080520 kernel: audit: type=1327 audit(1768437044.069:994): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334383137653935323935356133346238646131356135663430636234 Jan 15 00:30:44.070000 audit: BPF prog-id=258 op=UNLOAD Jan 15 00:30:44.070000 audit[6119]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2628 pid=6119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:44.070000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334383137653935323935356133346238646131356135663430636234 Jan 15 00:30:44.070000 audit: BPF prog-id=259 op=LOAD Jan 15 00:30:44.070000 audit[6119]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=2628 pid=6119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:44.070000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334383137653935323935356133346238646131356135663430636234 Jan 15 00:30:44.071000 audit: BPF prog-id=261 op=LOAD Jan 15 00:30:44.075000 audit: BPF prog-id=260 op=LOAD Jan 15 00:30:44.071000 audit[6119]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=2628 pid=6119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:44.071000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334383137653935323935356133346238646131356135663430636234 Jan 15 00:30:44.075000 audit: BPF prog-id=261 op=UNLOAD Jan 15 00:30:44.075000 audit[6119]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2628 pid=6119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:44.075000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334383137653935323935356133346238646131356135663430636234 Jan 15 00:30:44.075000 audit: BPF prog-id=259 op=UNLOAD Jan 15 00:30:44.075000 audit[6119]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2628 pid=6119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:44.075000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334383137653935323935356133346238646131356135663430636234 Jan 15 00:30:44.075000 audit: BPF prog-id=262 op=LOAD Jan 15 00:30:44.075000 audit[6119]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=2628 pid=6119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:44.075000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334383137653935323935356133346238646131356135663430636234 Jan 15 00:30:44.076000 audit: BPF prog-id=263 op=LOAD Jan 15 00:30:44.076000 audit[6125]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=3077 pid=6125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:44.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383662373830623462646531643563643831353161306235363566 Jan 15 00:30:44.080000 audit: BPF prog-id=263 op=UNLOAD Jan 15 00:30:44.080000 audit[6125]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3077 pid=6125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:44.080000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383662373830623462646531643563643831353161306235363566 Jan 15 00:30:44.080000 audit: BPF prog-id=264 op=LOAD Jan 15 00:30:44.080000 audit[6125]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=3077 pid=6125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:44.080000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383662373830623462646531643563643831353161306235363566 Jan 15 00:30:44.080000 audit: BPF prog-id=265 op=LOAD Jan 15 00:30:44.080000 audit[6125]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=3077 pid=6125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:44.080000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383662373830623462646531643563643831353161306235363566 Jan 15 00:30:44.080000 audit: BPF prog-id=265 op=UNLOAD Jan 15 00:30:44.080000 audit[6125]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3077 pid=6125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:44.080000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383662373830623462646531643563643831353161306235363566 Jan 15 00:30:44.080000 audit: BPF prog-id=264 op=UNLOAD Jan 15 00:30:44.080000 audit[6125]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3077 pid=6125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:44.080000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383662373830623462646531643563643831353161306235363566 Jan 15 00:30:44.080000 audit: BPF prog-id=266 op=LOAD Jan 15 00:30:44.080000 audit[6125]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=3077 pid=6125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 00:30:44.080000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383662373830623462646531643563643831353161306235363566 Jan 15 00:30:44.101032 containerd[1703]: time="2026-01-15T00:30:44.100982165Z" level=info msg="StartContainer for \"bb86b780b4bde1d5cd8151a0b565f610c602f43d38d465944eae20de4d1d032b\" returns successfully" Jan 15 00:30:44.116470 containerd[1703]: time="2026-01-15T00:30:44.116432013Z" level=info msg="StartContainer for \"c4817e952955a34b8da15a5f40cb4808174b7c09d24b9bdff33eff3420154540\" returns successfully" Jan 15 00:30:44.238619 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1486878906.mount: Deactivated successfully. Jan 15 00:30:46.873797 kubelet[2931]: E0115 00:30:46.873756 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-d7bdbd78b-v4vh7" podUID="782417a5-ecd0-40c5-85c0-45ead5d347fd" Jan 15 00:30:46.874915 kubelet[2931]: E0115 00:30:46.874839 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-92nsn" podUID="e8af8aef-db47-4bb0-9303-531f44a2593e" Jan 15 00:30:47.268000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-10.0.3.29:22-146.190.22.154:50364 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 00:30:47.269116 systemd[1]: Started sshd@50-10.0.3.29:22-146.190.22.154:50364.service - OpenSSH per-connection server daemon (146.190.22.154:50364). Jan 15 00:30:47.354421 sshd[6186]: Connection closed by authenticating user root 146.190.22.154 port 50364 [preauth] Jan 15 00:30:47.353000 audit[6186]: USER_ERR pid=6186 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=146.190.22.154 addr=146.190.22.154 terminal=ssh res=failed' Jan 15 00:30:47.357279 systemd[1]: sshd@50-10.0.3.29:22-146.190.22.154:50364.service: Deactivated successfully. Jan 15 00:30:47.356000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-10.0.3.29:22-146.190.22.154:50364 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 00:30:47.873323 kubelet[2931]: E0115 00:30:47.873266 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-955f9fbff-rtvzr" podUID="14a4f6b6-e857-4a63-b075-14b068610222" Jan 15 00:30:48.403902 kubelet[2931]: I0115 00:30:48.403856 2931 status_manager.go:890] "Failed to get status for pod" podUID="921845c7fba3dc0759018a9a18178d42" pod="kube-system/kube-controller-manager-ci-4515-1-0-n-1ddc109f0f" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.3.29:37036->10.0.3.61:2379: read: connection timed out" Jan 15 00:30:48.404980 kubelet[2931]: E0115 00:30:48.404852 2931 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.3.29:36948->10.0.3.61:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4515-1-0-n-1ddc109f0f.188ac01de3f7025d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4515-1-0-n-1ddc109f0f,UID:8bc3aec537de54b0a44b57386bb39227,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4515-1-0-n-1ddc109f0f,},FirstTimestamp:2026-01-15 00:30:38.307541597 +0000 UTC m=+479.525124944,LastTimestamp:2026-01-15 00:30:38.307541597 +0000 UTC m=+479.525124944,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4515-1-0-n-1ddc109f0f,}" Jan 15 00:30:48.873439 kubelet[2931]: E0115 00:30:48.873387 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-cd46w" podUID="576324d0-4c45-424a-9f35-c0de23b9b1ac" Jan 15 00:30:49.178755 systemd[1]: cri-containerd-66af349207e414b22989e05e0739947ad935fd5a4bbb0d2a5acff2e011cd964a.scope: Deactivated successfully. Jan 15 00:30:49.179130 systemd[1]: cri-containerd-66af349207e414b22989e05e0739947ad935fd5a4bbb0d2a5acff2e011cd964a.scope: Consumed 6.278s CPU time, 26.4M memory peak. 
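The kubelet errors in this stretch are no longer only image pulls: lease renewal (controller.go:195), pod status updates (status_manager.go:890) and event posting (event.go:359) are all failing because reads from 10.0.3.61:2379, etcd's client port, are timing out, and that error surfaces back through the local apiserver into kubelet's API calls. This suggests etcd is slow or unhealthy rather than anything being wrong with this node's pods; since these are read timeouts, connections are apparently being accepted, so a bare TCP probe like the sketch below (address and port copied from the records above; everything else is illustrative) can only rule out the most basic connectivity failure.

```python
import socket
import time

def probe(host: str, port: int, timeout: float = 3.0) -> float | None:
    """Return the TCP connect latency in seconds, or None if the connect fails."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return None

if __name__ == "__main__":
    # 10.0.3.61:2379 is the etcd endpoint the journal reports read timeouts against.
    latency = probe("10.0.3.61", 2379)
    print("unreachable" if latency is None else f"connected in {latency * 1000:.1f} ms")
```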
Jan 15 00:30:49.179000 audit: BPF prog-id=267 op=LOAD Jan 15 00:30:49.181334 containerd[1703]: time="2026-01-15T00:30:49.181281745Z" level=info msg="received container exit event container_id:\"66af349207e414b22989e05e0739947ad935fd5a4bbb0d2a5acff2e011cd964a\" id:\"66af349207e414b22989e05e0739947ad935fd5a4bbb0d2a5acff2e011cd964a\" pid:2788 exit_status:1 exited_at:{seconds:1768437049 nanos:180523183}" Jan 15 00:30:49.181675 kernel: kauditd_printk_skb: 43 callbacks suppressed Jan 15 00:30:49.181908 kernel: audit: type=1334 audit(1768437049.179:1012): prog-id=267 op=LOAD Jan 15 00:30:49.181944 kernel: audit: type=1334 audit(1768437049.179:1013): prog-id=83 op=UNLOAD Jan 15 00:30:49.179000 audit: BPF prog-id=83 op=UNLOAD Jan 15 00:30:49.183000 audit: BPF prog-id=108 op=UNLOAD Jan 15 00:30:49.183000 audit: BPF prog-id=112 op=UNLOAD Jan 15 00:30:49.186051 kernel: audit: type=1334 audit(1768437049.183:1014): prog-id=108 op=UNLOAD Jan 15 00:30:49.186098 kernel: audit: type=1334 audit(1768437049.183:1015): prog-id=112 op=UNLOAD Jan 15 00:30:49.202641 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-66af349207e414b22989e05e0739947ad935fd5a4bbb0d2a5acff2e011cd964a-rootfs.mount: Deactivated successfully. Jan 15 00:30:50.028194 kubelet[2931]: I0115 00:30:50.026904 2931 scope.go:117] "RemoveContainer" containerID="66af349207e414b22989e05e0739947ad935fd5a4bbb0d2a5acff2e011cd964a" Jan 15 00:30:50.029306 containerd[1703]: time="2026-01-15T00:30:50.029255586Z" level=info msg="CreateContainer within sandbox \"0c4540d0d1351112a68d38f219412562c5fedeaa7d0a6c731ee0cea6fe24e24b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 15 00:30:50.041978 containerd[1703]: time="2026-01-15T00:30:50.041936984Z" level=info msg="Container a9219d792c3495fc46574c4bd9c426aed8b042ca1b6b6d14a2668103989a1af2: CDI devices from CRI Config.CDIDevices: []" Jan 15 00:30:50.050131 containerd[1703]: time="2026-01-15T00:30:50.050080569Z" level=info msg="CreateContainer within sandbox \"0c4540d0d1351112a68d38f219412562c5fedeaa7d0a6c731ee0cea6fe24e24b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"a9219d792c3495fc46574c4bd9c426aed8b042ca1b6b6d14a2668103989a1af2\"" Jan 15 00:30:50.050600 containerd[1703]: time="2026-01-15T00:30:50.050546731Z" level=info msg="StartContainer for \"a9219d792c3495fc46574c4bd9c426aed8b042ca1b6b6d14a2668103989a1af2\"" Jan 15 00:30:50.051707 containerd[1703]: time="2026-01-15T00:30:50.051672174Z" level=info msg="connecting to shim a9219d792c3495fc46574c4bd9c426aed8b042ca1b6b6d14a2668103989a1af2" address="unix:///run/containerd/s/446433a4535420d1034752d1284badbc2a2f0604c439763c17cae9ac6aa8e33c" protocol=ttrpc version=3 Jan 15 00:30:50.071437 systemd[1]: Started cri-containerd-a9219d792c3495fc46574c4bd9c426aed8b042ca1b6b6d14a2668103989a1af2.scope - libcontainer container a9219d792c3495fc46574c4bd9c426aed8b042ca1b6b6d14a2668103989a1af2. 
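The audit PROCTITLE fields scattered through this section carry the process command line, hex-encoded with NUL separators: the short value repeated in the SSH records decodes to "sshd-session: core [priv]", and the long values around 00:30:44 and 00:30:50 decode to truncated runc invocations (runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/…). Decoding them needs nothing beyond the standard library; the function name here is just for illustration.

```python
def decode_proctitle(hex_title: str) -> str:
    """Audit PROCTITLE is the process argv, hex-encoded with NUL separators."""
    return bytes.fromhex(hex_title).replace(b"\x00", b" ").decode(errors="replace")

# e.g. the sshd-session records above:
print(decode_proctitle("737368642D73657373696F6E3A20636F7265205B707269765D"))
# -> sshd-session: core [priv]
```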
Jan 15 00:30:50.081000 audit: BPF prog-id=268 op=LOAD
Jan 15 00:30:50.082000 audit: BPF prog-id=269 op=LOAD
Jan 15 00:30:50.084754 kernel: audit: type=1334 audit(1768437050.081:1016): prog-id=268 op=LOAD
Jan 15 00:30:50.084836 kernel: audit: type=1334 audit(1768437050.082:1017): prog-id=269 op=LOAD
Jan 15 00:30:50.084864 kernel: audit: type=1300 audit(1768437050.082:1017): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=2585 pid=6212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 15 00:30:50.082000 audit[6212]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=2585 pid=6212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 15 00:30:50.082000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139323139643739326333343935666334363537346334626439633432
Jan 15 00:30:50.083000 audit: BPF prog-id=269 op=UNLOAD
Jan 15 00:30:50.092188 kernel: audit: type=1327 audit(1768437050.082:1017): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139323139643739326333343935666334363537346334626439633432
Jan 15 00:30:50.092284 kernel: audit: type=1334 audit(1768437050.083:1018): prog-id=269 op=UNLOAD
Jan 15 00:30:50.092319 kernel: audit: type=1300 audit(1768437050.083:1018): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2585 pid=6212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 15 00:30:50.083000 audit[6212]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2585 pid=6212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 15 00:30:50.083000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139323139643739326333343935666334363537346334626439633432
Jan 15 00:30:50.083000 audit: BPF prog-id=270 op=LOAD
Jan 15 00:30:50.083000 audit[6212]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=2585 pid=6212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 15 00:30:50.083000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139323139643739326333343935666334363537346334626439633432
Jan 15 00:30:50.083000 audit: BPF prog-id=271 op=LOAD
Jan 15 00:30:50.083000 audit[6212]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=2585 pid=6212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 15 00:30:50.083000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139323139643739326333343935666334363537346334626439633432
Jan 15 00:30:50.086000 audit: BPF prog-id=271 op=UNLOAD
Jan 15 00:30:50.086000 audit[6212]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2585 pid=6212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 15 00:30:50.086000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139323139643739326333343935666334363537346334626439633432
Jan 15 00:30:50.087000 audit: BPF prog-id=270 op=UNLOAD
Jan 15 00:30:50.087000 audit[6212]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2585 pid=6212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 15 00:30:50.087000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139323139643739326333343935666334363537346334626439633432
Jan 15 00:30:50.087000 audit: BPF prog-id=272 op=LOAD
Jan 15 00:30:50.087000 audit[6212]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=2585 pid=6212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 15 00:30:50.087000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139323139643739326333343935666334363537346334626439633432
Jan 15 00:30:50.120601 containerd[1703]: time="2026-01-15T00:30:50.120489065Z" level=info msg="StartContainer for \"a9219d792c3495fc46574c4bd9c426aed8b042ca1b6b6d14a2668103989a1af2\" returns successfully"
Jan 15 00:30:50.873102 kubelet[2931]: E0115 00:30:50.872968 2931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7877d5fb5-c885v" podUID="051b417e-bac4-4f72-8b07-3775d126567f"
Jan 15 00:30:53.668087 kubelet[2931]: E0115 00:30:53.667996 2931 controller.go:195] "Failed to update lease" err="Put \"https://10.0.3.29:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-n-1ddc109f0f?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 15 00:30:55.287633 systemd[1]: cri-containerd-bb86b780b4bde1d5cd8151a0b565f610c602f43d38d465944eae20de4d1d032b.scope: Deactivated successfully.
Jan 15 00:30:55.289271 containerd[1703]: time="2026-01-15T00:30:55.289232916Z" level=info msg="received container exit event container_id:\"bb86b780b4bde1d5cd8151a0b565f610c602f43d38d465944eae20de4d1d032b\" id:\"bb86b780b4bde1d5cd8151a0b565f610c602f43d38d465944eae20de4d1d032b\" pid:6150 exit_status:1 exited_at:{seconds:1768437055 nanos:288165833}"
Jan 15 00:30:55.290000 audit: BPF prog-id=260 op=UNLOAD
Jan 15 00:30:55.292814 kernel: kauditd_printk_skb: 16 callbacks suppressed
Jan 15 00:30:55.292872 kernel: audit: type=1334 audit(1768437055.290:1024): prog-id=260 op=UNLOAD
Jan 15 00:30:55.292892 kernel: audit: type=1334 audit(1768437055.290:1025): prog-id=266 op=UNLOAD
Jan 15 00:30:55.290000 audit: BPF prog-id=266 op=UNLOAD
Jan 15 00:30:55.307586 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bb86b780b4bde1d5cd8151a0b565f610c602f43d38d465944eae20de4d1d032b-rootfs.mount: Deactivated successfully.