Jan 19 09:24:35.314831 kernel: Linux version 6.12.65-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 18:44:02 -00 2026 Jan 19 09:24:35.314863 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=e880b5400e832e1de59b993d9ba6b86a9089175f10b4985da8b7b47cc8c74099 Jan 19 09:24:35.314877 kernel: BIOS-provided physical RAM map: Jan 19 09:24:35.314888 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 19 09:24:35.314901 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ed3efff] usable Jan 19 09:24:35.314911 kernel: BIOS-e820: [mem 0x000000007ed3f000-0x000000007edfffff] reserved Jan 19 09:24:35.314922 kernel: BIOS-e820: [mem 0x000000007ee00000-0x000000007f8ecfff] usable Jan 19 09:24:35.314932 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 19 09:24:35.314942 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 19 09:24:35.314952 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 19 09:24:35.314961 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007ff7bfff] usable Jan 19 09:24:35.314971 kernel: BIOS-e820: [mem 0x000000007ff7c000-0x000000007fffffff] reserved Jan 19 09:24:35.314985 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 19 09:24:35.314995 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 19 09:24:35.315007 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 19 09:24:35.315017 kernel: BIOS-e820: [mem 0x0000000100000000-0x0000000179ffffff] usable Jan 19 09:24:35.315027 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Jan 19 09:24:35.315042 kernel: NX (Execute Disable) protection: active Jan 19 09:24:35.315052 kernel: APIC: Static calls initialized Jan 19 09:24:35.315062 kernel: e820: update [mem 0x7dfac018-0x7dfb5a57] usable ==> usable Jan 19 09:24:35.315073 kernel: e820: update [mem 0x7df70018-0x7dfab657] usable ==> usable Jan 19 09:24:35.315083 kernel: e820: update [mem 0x7df34018-0x7df6f657] usable ==> usable Jan 19 09:24:35.315093 kernel: extended physical RAM map: Jan 19 09:24:35.315104 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 19 09:24:35.315114 kernel: reserve setup_data: [mem 0x0000000000100000-0x000000007df34017] usable Jan 19 09:24:35.315124 kernel: reserve setup_data: [mem 0x000000007df34018-0x000000007df6f657] usable Jan 19 09:24:35.315134 kernel: reserve setup_data: [mem 0x000000007df6f658-0x000000007df70017] usable Jan 19 09:24:35.315149 kernel: reserve setup_data: [mem 0x000000007df70018-0x000000007dfab657] usable Jan 19 09:24:35.315159 kernel: reserve setup_data: [mem 0x000000007dfab658-0x000000007dfac017] usable Jan 19 09:24:35.315169 kernel: reserve setup_data: [mem 0x000000007dfac018-0x000000007dfb5a57] usable Jan 19 09:24:35.315179 kernel: reserve setup_data: [mem 0x000000007dfb5a58-0x000000007ed3efff] usable Jan 19 09:24:35.315190 kernel: reserve setup_data: [mem 0x000000007ed3f000-0x000000007edfffff] reserved Jan 19 09:24:35.315200 kernel: reserve setup_data: [mem 0x000000007ee00000-0x000000007f8ecfff] usable Jan 19 09:24:35.315210 kernel: 
reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 19 09:24:35.315220 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 19 09:24:35.315231 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 19 09:24:35.315241 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007ff7bfff] usable Jan 19 09:24:35.315251 kernel: reserve setup_data: [mem 0x000000007ff7c000-0x000000007fffffff] reserved Jan 19 09:24:35.315266 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 19 09:24:35.315276 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 19 09:24:35.315293 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 19 09:24:35.315303 kernel: reserve setup_data: [mem 0x0000000100000000-0x0000000179ffffff] usable Jan 19 09:24:35.315318 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Jan 19 09:24:35.315329 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II Jan 19 09:24:35.315340 kernel: efi: SMBIOS=0x7f988000 SMBIOS 3.0=0x7f986000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7e00c198 RNG=0x7fb73018 Jan 19 09:24:35.315351 kernel: random: crng init done Jan 19 09:24:35.315362 kernel: efi: Remove mem136: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map Jan 19 09:24:35.315373 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved Jan 19 09:24:35.315384 kernel: secureboot: Secure boot disabled Jan 19 09:24:35.315394 kernel: SMBIOS 3.0.0 present. Jan 19 09:24:35.315405 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017 Jan 19 09:24:35.315420 kernel: DMI: Memory slots populated: 1/1 Jan 19 09:24:35.315430 kernel: Hypervisor detected: KVM Jan 19 09:24:35.315481 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000 Jan 19 09:24:35.315491 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Jan 19 09:24:35.315502 kernel: kvm-clock: using sched offset of 13044842867 cycles Jan 19 09:24:35.315513 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Jan 19 09:24:35.315525 kernel: tsc: Detected 2399.998 MHz processor Jan 19 09:24:35.315537 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Jan 19 09:24:35.315549 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Jan 19 09:24:35.315560 kernel: last_pfn = 0x17a000 max_arch_pfn = 0x10000000000 Jan 19 09:24:35.315576 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs Jan 19 09:24:35.315588 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Jan 19 09:24:35.315599 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000 Jan 19 09:24:35.315610 kernel: Using GB pages for direct mapping Jan 19 09:24:35.315621 kernel: ACPI: Early table checksum verification disabled Jan 19 09:24:35.315633 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS ) Jan 19 09:24:35.315644 kernel: ACPI: XSDT 0x000000007FB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013) Jan 19 09:24:35.315660 kernel: ACPI: FACP 0x000000007FB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 19 09:24:35.315671 kernel: ACPI: DSDT 0x000000007FB7A000 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 19 09:24:35.315682 kernel: ACPI: FACS 0x000000007FBDD000 000040 Jan 19 09:24:35.315694 kernel: ACPI: APIC 0x000000007FB78000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 19 09:24:35.315705 kernel: ACPI: HPET 
0x000000007FB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 19 09:24:35.315716 kernel: ACPI: MCFG 0x000000007FB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 19 09:24:35.315727 kernel: ACPI: WAET 0x000000007FB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 19 09:24:35.315743 kernel: ACPI: BGRT 0x000000007FB74000 000038 (v01 INTEL EDK2 00000002 01000013) Jan 19 09:24:35.315754 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb79000-0x7fb790f3] Jan 19 09:24:35.315765 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb7a000-0x7fb7c442] Jan 19 09:24:35.315777 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f] Jan 19 09:24:35.315788 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb78000-0x7fb7807f] Jan 19 09:24:35.315799 kernel: ACPI: Reserving HPET table memory at [mem 0x7fb77000-0x7fb77037] Jan 19 09:24:35.315810 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb76000-0x7fb7603b] Jan 19 09:24:35.315825 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb75000-0x7fb75027] Jan 19 09:24:35.315836 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb74000-0x7fb74037] Jan 19 09:24:35.315847 kernel: No NUMA configuration found Jan 19 09:24:35.315858 kernel: Faking a node at [mem 0x0000000000000000-0x0000000179ffffff] Jan 19 09:24:35.315869 kernel: NODE_DATA(0) allocated [mem 0x179ff6dc0-0x179ffdfff] Jan 19 09:24:35.315881 kernel: Zone ranges: Jan 19 09:24:35.315892 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jan 19 09:24:35.315903 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Jan 19 09:24:35.315919 kernel: Normal [mem 0x0000000100000000-0x0000000179ffffff] Jan 19 09:24:35.315930 kernel: Device empty Jan 19 09:24:35.315941 kernel: Movable zone start for each node Jan 19 09:24:35.315952 kernel: Early memory node ranges Jan 19 09:24:35.315963 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Jan 19 09:24:35.315974 kernel: node 0: [mem 0x0000000000100000-0x000000007ed3efff] Jan 19 09:24:35.315986 kernel: node 0: [mem 0x000000007ee00000-0x000000007f8ecfff] Jan 19 09:24:35.315997 kernel: node 0: [mem 0x000000007fbff000-0x000000007ff7bfff] Jan 19 09:24:35.316012 kernel: node 0: [mem 0x0000000100000000-0x0000000179ffffff] Jan 19 09:24:35.316023 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x0000000179ffffff] Jan 19 09:24:35.316035 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 19 09:24:35.316046 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Jan 19 09:24:35.316057 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges Jan 19 09:24:35.316069 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges Jan 19 09:24:35.316080 kernel: On node 0, zone Normal: 132 pages in unavailable ranges Jan 19 09:24:35.316095 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges Jan 19 09:24:35.316107 kernel: ACPI: PM-Timer IO Port: 0x608 Jan 19 09:24:35.316118 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Jan 19 09:24:35.316129 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Jan 19 09:24:35.316141 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Jan 19 09:24:35.316152 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Jan 19 09:24:35.316163 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Jan 19 09:24:35.316179 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Jan 19 09:24:35.316190 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) 
Jan 19 09:24:35.316201 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 19 09:24:35.316213 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Jan 19 09:24:35.316224 kernel: CPU topo: Max. logical packages: 1 Jan 19 09:24:35.316235 kernel: CPU topo: Max. logical dies: 1 Jan 19 09:24:35.316261 kernel: CPU topo: Max. dies per package: 1 Jan 19 09:24:35.316273 kernel: CPU topo: Max. threads per core: 1 Jan 19 09:24:35.316285 kernel: CPU topo: Num. cores per package: 2 Jan 19 09:24:35.316296 kernel: CPU topo: Num. threads per package: 2 Jan 19 09:24:35.316312 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs Jan 19 09:24:35.316323 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Jan 19 09:24:35.316335 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices Jan 19 09:24:35.316347 kernel: Booting paravirtualized kernel on KVM Jan 19 09:24:35.316359 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 19 09:24:35.316375 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Jan 19 09:24:35.316386 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 Jan 19 09:24:35.316398 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 Jan 19 09:24:35.316410 kernel: pcpu-alloc: [0] 0 1 Jan 19 09:24:35.316421 kernel: kvm-guest: PV spinlocks disabled, no host support Jan 19 09:24:35.316463 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=e880b5400e832e1de59b993d9ba6b86a9089175f10b4985da8b7b47cc8c74099 Jan 19 09:24:35.316480 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jan 19 09:24:35.316492 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 19 09:24:35.316503 kernel: Fallback order for Node 0: 0 Jan 19 09:24:35.316515 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1022792 Jan 19 09:24:35.316526 kernel: Policy zone: Normal Jan 19 09:24:35.316538 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 19 09:24:35.316550 kernel: software IO TLB: area num 2. Jan 19 09:24:35.316565 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jan 19 09:24:35.316577 kernel: ftrace: allocating 40128 entries in 157 pages Jan 19 09:24:35.316589 kernel: ftrace: allocated 157 pages with 5 groups Jan 19 09:24:35.316600 kernel: Dynamic Preempt: voluntary Jan 19 09:24:35.316612 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 19 09:24:35.316630 kernel: rcu: RCU event tracing is enabled. Jan 19 09:24:35.316643 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jan 19 09:24:35.316655 kernel: Trampoline variant of Tasks RCU enabled. Jan 19 09:24:35.316671 kernel: Rude variant of Tasks RCU enabled. Jan 19 09:24:35.316683 kernel: Tracing variant of Tasks RCU enabled. Jan 19 09:24:35.316694 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jan 19 09:24:35.316706 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jan 19 09:24:35.316718 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
Jan 19 09:24:35.316730 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 19 09:24:35.316742 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 19 09:24:35.316758 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 Jan 19 09:24:35.316770 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Jan 19 09:24:35.316781 kernel: Console: colour dummy device 80x25 Jan 19 09:24:35.316793 kernel: printk: legacy console [tty0] enabled Jan 19 09:24:35.316805 kernel: printk: legacy console [ttyS0] enabled Jan 19 09:24:35.316816 kernel: ACPI: Core revision 20240827 Jan 19 09:24:35.316828 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns Jan 19 09:24:35.316845 kernel: APIC: Switch to symmetric I/O mode setup Jan 19 09:24:35.316856 kernel: x2apic enabled Jan 19 09:24:35.316868 kernel: APIC: Switched APIC routing to: physical x2apic Jan 19 09:24:35.316880 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Jan 19 09:24:35.316892 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns Jan 19 09:24:35.316904 kernel: Calibrating delay loop (skipped) preset value.. 4799.99 BogoMIPS (lpj=2399998) Jan 19 09:24:35.316916 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Jan 19 09:24:35.316932 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 Jan 19 09:24:35.316943 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 Jan 19 09:24:35.316955 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 19 09:24:35.316967 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jan 19 09:24:35.316979 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jan 19 09:24:35.316990 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jan 19 09:24:35.317002 kernel: active return thunk: srso_alias_return_thunk Jan 19 09:24:35.317018 kernel: Speculative Return Stack Overflow: Mitigation: Safe RET Jan 19 09:24:35.317030 kernel: Transient Scheduler Attacks: Forcing mitigation on in a VM Jan 19 09:24:35.317042 kernel: Transient Scheduler Attacks: Vulnerable: Clear CPU buffers attempted, no microcode Jan 19 09:24:35.317053 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 19 09:24:35.317065 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 19 09:24:35.317077 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 19 09:24:35.317089 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Jan 19 09:24:35.317105 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Jan 19 09:24:35.317117 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Jan 19 09:24:35.317129 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers' Jan 19 09:24:35.317140 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 19 09:24:35.317152 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 Jan 19 09:24:35.317164 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 Jan 19 09:24:35.317175 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 Jan 19 09:24:35.317191 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8 Jan 19 09:24:35.317203 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 
'compacted' format. Jan 19 09:24:35.317215 kernel: Freeing SMP alternatives memory: 32K Jan 19 09:24:35.317226 kernel: pid_max: default: 32768 minimum: 301 Jan 19 09:24:35.317238 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jan 19 09:24:35.317250 kernel: landlock: Up and running. Jan 19 09:24:35.317262 kernel: SELinux: Initializing. Jan 19 09:24:35.317278 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 19 09:24:35.317289 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 19 09:24:35.317301 kernel: smpboot: CPU0: AMD EPYC-Genoa Processor (family: 0x19, model: 0x11, stepping: 0x0) Jan 19 09:24:35.317313 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. Jan 19 09:24:35.317325 kernel: ... version: 0 Jan 19 09:24:35.317337 kernel: ... bit width: 48 Jan 19 09:24:35.317348 kernel: ... generic registers: 6 Jan 19 09:24:35.317364 kernel: ... value mask: 0000ffffffffffff Jan 19 09:24:35.317375 kernel: ... max period: 00007fffffffffff Jan 19 09:24:35.317387 kernel: ... fixed-purpose events: 0 Jan 19 09:24:35.317399 kernel: ... event mask: 000000000000003f Jan 19 09:24:35.317411 kernel: signal: max sigframe size: 3376 Jan 19 09:24:35.317422 kernel: rcu: Hierarchical SRCU implementation. Jan 19 09:24:35.317461 kernel: rcu: Max phase no-delay instances is 400. Jan 19 09:24:35.317473 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jan 19 09:24:35.317489 kernel: smp: Bringing up secondary CPUs ... Jan 19 09:24:35.317500 kernel: smpboot: x86: Booting SMP configuration: Jan 19 09:24:35.317512 kernel: .... node #0, CPUs: #1 Jan 19 09:24:35.317524 kernel: smp: Brought up 1 node, 2 CPUs Jan 19 09:24:35.317536 kernel: smpboot: Total of 2 processors activated (9599.99 BogoMIPS) Jan 19 09:24:35.317548 kernel: Memory: 3873088K/4091168K available (14336K kernel code, 2445K rwdata, 31644K rodata, 15536K init, 2500K bss, 212448K reserved, 0K cma-reserved) Jan 19 09:24:35.317560 kernel: devtmpfs: initialized Jan 19 09:24:35.317576 kernel: x86/mm: Memory block size: 128MB Jan 19 09:24:35.317588 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes) Jan 19 09:24:35.317599 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 19 09:24:35.317611 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jan 19 09:24:35.317623 kernel: pinctrl core: initialized pinctrl subsystem Jan 19 09:24:35.317635 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 19 09:24:35.317647 kernel: audit: initializing netlink subsys (disabled) Jan 19 09:24:35.317663 kernel: audit: type=2000 audit(1768814672.206:1): state=initialized audit_enabled=0 res=1 Jan 19 09:24:35.317675 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 19 09:24:35.317686 kernel: thermal_sys: Registered thermal governor 'user_space' Jan 19 09:24:35.317698 kernel: cpuidle: using governor menu Jan 19 09:24:35.317710 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 19 09:24:35.317722 kernel: dca service started, version 1.12.1 Jan 19 09:24:35.317734 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Jan 19 09:24:35.317750 kernel: PCI: Using configuration type 1 for base access Jan 19 09:24:35.317762 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Jan 19 09:24:35.317774 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 19 09:24:35.317785 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 19 09:24:35.317797 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 19 09:24:35.317809 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 19 09:24:35.317821 kernel: ACPI: Added _OSI(Module Device) Jan 19 09:24:35.317836 kernel: ACPI: Added _OSI(Processor Device) Jan 19 09:24:35.317848 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 19 09:24:35.317860 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 19 09:24:35.317871 kernel: ACPI: Interpreter enabled Jan 19 09:24:35.317883 kernel: ACPI: PM: (supports S0 S5) Jan 19 09:24:35.317895 kernel: ACPI: Using IOAPIC for interrupt routing Jan 19 09:24:35.317907 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 19 09:24:35.317923 kernel: PCI: Using E820 reservations for host bridge windows Jan 19 09:24:35.317935 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Jan 19 09:24:35.317947 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jan 19 09:24:35.318328 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 19 09:24:35.318662 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Jan 19 09:24:35.318958 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Jan 19 09:24:35.318979 kernel: PCI host bridge to bus 0000:00 Jan 19 09:24:35.319261 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 19 09:24:35.319560 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Jan 19 09:24:35.319808 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 19 09:24:35.319939 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window] Jan 19 09:24:35.320069 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window] Jan 19 09:24:35.320202 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc7ffffffff window] Jan 19 09:24:35.320332 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jan 19 09:24:35.320508 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Jan 19 09:24:35.320659 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint Jan 19 09:24:35.320802 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref] Jan 19 09:24:35.320945 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc060500000-0xc060503fff 64bit pref] Jan 19 09:24:35.321089 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8138a000-0x8138afff] Jan 19 09:24:35.321228 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref] Jan 19 09:24:35.321370 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 19 09:24:35.321535 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 19 09:24:35.321677 kernel: pci 0000:00:02.0: BAR 0 [mem 0x81389000-0x81389fff] Jan 19 09:24:35.321820 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jan 19 09:24:35.321959 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff] Jan 19 09:24:35.322099 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref] Jan 19 09:24:35.322245 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 19 09:24:35.322385 kernel: pci 0000:00:02.1: BAR 0 [mem 
0x81388000-0x81388fff] Jan 19 09:24:35.322555 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jan 19 09:24:35.322698 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff] Jan 19 09:24:35.322847 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 19 09:24:35.322987 kernel: pci 0000:00:02.2: BAR 0 [mem 0x81387000-0x81387fff] Jan 19 09:24:35.323126 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jan 19 09:24:35.323266 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff] Jan 19 09:24:35.323464 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref] Jan 19 09:24:35.323610 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 19 09:24:35.323754 kernel: pci 0000:00:02.3: BAR 0 [mem 0x81386000-0x81386fff] Jan 19 09:24:35.323894 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jan 19 09:24:35.324033 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref] Jan 19 09:24:35.324370 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 19 09:24:35.324542 kernel: pci 0000:00:02.4: BAR 0 [mem 0x81385000-0x81385fff] Jan 19 09:24:35.324707 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jan 19 09:24:35.324862 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff] Jan 19 09:24:35.325005 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref] Jan 19 09:24:35.325155 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 19 09:24:35.325296 kernel: pci 0000:00:02.5: BAR 0 [mem 0x81384000-0x81384fff] Jan 19 09:24:35.325465 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jan 19 09:24:35.325609 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff] Jan 19 09:24:35.325748 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref] Jan 19 09:24:35.325893 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 19 09:24:35.326033 kernel: pci 0000:00:02.6: BAR 0 [mem 0x81383000-0x81383fff] Jan 19 09:24:35.326172 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jan 19 09:24:35.326314 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff] Jan 19 09:24:35.326843 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref] Jan 19 09:24:35.327121 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 19 09:24:35.327498 kernel: pci 0000:00:02.7: BAR 0 [mem 0x81382000-0x81382fff] Jan 19 09:24:35.329484 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jan 19 09:24:35.329649 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff] Jan 19 09:24:35.329794 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref] Jan 19 09:24:35.329946 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 19 09:24:35.330087 kernel: pci 0000:00:03.0: BAR 0 [mem 0x81381000-0x81381fff] Jan 19 09:24:35.330228 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jan 19 09:24:35.330367 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff] Jan 19 09:24:35.330527 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref] Jan 19 09:24:35.330673 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Jan 19 09:24:35.330812 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Jan 19 09:24:35.330957 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 
conventional PCI endpoint Jan 19 09:24:35.331098 kernel: pci 0000:00:1f.2: BAR 4 [io 0x6040-0x605f] Jan 19 09:24:35.331241 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x81380000-0x81380fff] Jan 19 09:24:35.331392 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Jan 19 09:24:35.331547 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6000-0x603f] Jan 19 09:24:35.331698 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 19 09:24:35.331866 kernel: pci 0000:01:00.0: BAR 1 [mem 0x81200000-0x81200fff] Jan 19 09:24:35.332012 kernel: pci 0000:01:00.0: BAR 4 [mem 0xc060000000-0xc060003fff 64bit pref] Jan 19 09:24:35.332159 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 19 09:24:35.332300 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jan 19 09:24:35.333524 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Jan 19 09:24:35.333685 kernel: pci 0000:02:00.0: BAR 0 [mem 0x81100000-0x81103fff 64bit] Jan 19 09:24:35.333829 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jan 19 09:24:35.333982 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint Jan 19 09:24:35.334131 kernel: pci 0000:03:00.0: BAR 1 [mem 0x81000000-0x81000fff] Jan 19 09:24:35.334277 kernel: pci 0000:03:00.0: BAR 4 [mem 0xc060100000-0xc060103fff 64bit pref] Jan 19 09:24:35.334419 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jan 19 09:24:35.334605 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 19 09:24:35.334751 kernel: pci 0000:04:00.0: BAR 4 [mem 0xc060200000-0xc060203fff 64bit pref] Jan 19 09:24:35.334895 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jan 19 09:24:35.335046 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 19 09:24:35.335190 kernel: pci 0000:05:00.0: BAR 1 [mem 0x80f00000-0x80f00fff] Jan 19 09:24:35.335333 kernel: pci 0000:05:00.0: BAR 4 [mem 0xc060300000-0xc060303fff 64bit pref] Jan 19 09:24:35.335520 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jan 19 09:24:35.335674 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint Jan 19 09:24:35.335821 kernel: pci 0000:06:00.0: BAR 1 [mem 0x80e00000-0x80e00fff] Jan 19 09:24:35.335965 kernel: pci 0000:06:00.0: BAR 4 [mem 0xc060400000-0xc060403fff 64bit pref] Jan 19 09:24:35.336104 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jan 19 09:24:35.336113 kernel: acpiphp: Slot [0] registered Jan 19 09:24:35.336262 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 19 09:24:35.336406 kernel: pci 0000:07:00.0: BAR 1 [mem 0x80c00000-0x80c00fff] Jan 19 09:24:35.336571 kernel: pci 0000:07:00.0: BAR 4 [mem 0xc000000000-0xc000003fff 64bit pref] Jan 19 09:24:35.336715 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 19 09:24:35.336856 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jan 19 09:24:35.336865 kernel: acpiphp: Slot [0-2] registered Jan 19 09:24:35.337002 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jan 19 09:24:35.337010 kernel: acpiphp: Slot [0-3] registered Jan 19 09:24:35.337151 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jan 19 09:24:35.337161 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Jan 19 09:24:35.337178 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Jan 19 09:24:35.337187 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 19 09:24:35.337193 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Jan 19 09:24:35.337200 kernel: ACPI: PCI: Interrupt link LNKE 
configured for IRQ 10 Jan 19 09:24:35.337208 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Jan 19 09:24:35.337214 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Jan 19 09:24:35.337221 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Jan 19 09:24:35.337227 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Jan 19 09:24:35.337233 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Jan 19 09:24:35.337240 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Jan 19 09:24:35.337246 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Jan 19 09:24:35.337252 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Jan 19 09:24:35.337261 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Jan 19 09:24:35.337267 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Jan 19 09:24:35.337276 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Jan 19 09:24:35.337282 kernel: iommu: Default domain type: Translated Jan 19 09:24:35.337291 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 19 09:24:35.337297 kernel: efivars: Registered efivars operations Jan 19 09:24:35.337303 kernel: PCI: Using ACPI for IRQ routing Jan 19 09:24:35.337310 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 19 09:24:35.337316 kernel: e820: reserve RAM buffer [mem 0x7df34018-0x7fffffff] Jan 19 09:24:35.337323 kernel: e820: reserve RAM buffer [mem 0x7df70018-0x7fffffff] Jan 19 09:24:35.337329 kernel: e820: reserve RAM buffer [mem 0x7dfac018-0x7fffffff] Jan 19 09:24:35.337337 kernel: e820: reserve RAM buffer [mem 0x7ed3f000-0x7fffffff] Jan 19 09:24:35.337343 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff] Jan 19 09:24:35.337350 kernel: e820: reserve RAM buffer [mem 0x7ff7c000-0x7fffffff] Jan 19 09:24:35.337356 kernel: e820: reserve RAM buffer [mem 0x17a000000-0x17bffffff] Jan 19 09:24:35.337542 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Jan 19 09:24:35.337735 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Jan 19 09:24:35.337975 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 19 09:24:35.337987 kernel: vgaarb: loaded Jan 19 09:24:35.337994 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 Jan 19 09:24:35.338002 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter Jan 19 09:24:35.338009 kernel: clocksource: Switched to clocksource kvm-clock Jan 19 09:24:35.338015 kernel: VFS: Disk quotas dquot_6.6.0 Jan 19 09:24:35.338022 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 19 09:24:35.338047 kernel: pnp: PnP ACPI init Jan 19 09:24:35.338206 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved Jan 19 09:24:35.338215 kernel: pnp: PnP ACPI: found 5 devices Jan 19 09:24:35.338222 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 19 09:24:35.338228 kernel: NET: Registered PF_INET protocol family Jan 19 09:24:35.338235 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 19 09:24:35.338241 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jan 19 09:24:35.338250 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 19 09:24:35.338257 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 19 09:24:35.338263 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jan 19 09:24:35.338269 
kernel: TCP: Hash tables configured (established 32768 bind 32768) Jan 19 09:24:35.338276 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 19 09:24:35.338282 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 19 09:24:35.338288 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 19 09:24:35.338297 kernel: NET: Registered PF_XDP protocol family Jan 19 09:24:35.338468 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 19 09:24:35.338616 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 19 09:24:35.338764 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 19 09:24:35.338927 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 19 09:24:35.339069 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 19 09:24:35.339210 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned Jan 19 09:24:35.339354 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned Jan 19 09:24:35.339540 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned Jan 19 09:24:35.343951 kernel: pci 0000:01:00.0: ROM [mem 0x81280000-0x812fffff pref]: assigned Jan 19 09:24:35.344119 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jan 19 09:24:35.344264 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff] Jan 19 09:24:35.344405 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref] Jan 19 09:24:35.344571 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jan 19 09:24:35.344716 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff] Jan 19 09:24:35.344859 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jan 19 09:24:35.344998 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff] Jan 19 09:24:35.345153 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref] Jan 19 09:24:35.345295 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jan 19 09:24:35.345465 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref] Jan 19 09:24:35.345608 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jan 19 09:24:35.345793 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff] Jan 19 09:24:35.345934 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref] Jan 19 09:24:35.346074 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jan 19 09:24:35.346214 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff] Jan 19 09:24:35.346353 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref] Jan 19 09:24:35.346524 kernel: pci 0000:07:00.0: ROM [mem 0x80c80000-0x80cfffff pref]: assigned Jan 19 09:24:35.346666 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jan 19 09:24:35.346931 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff] Jan 19 09:24:35.347073 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff] Jan 19 09:24:35.347217 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref] Jan 19 09:24:35.347358 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jan 19 09:24:35.348565 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff] Jan 19 09:24:35.348722 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff] Jan 19 09:24:35.348866 kernel: pci 0000:00:02.7: bridge window [mem 
0xc020000000-0xc03fffffff 64bit pref] Jan 19 09:24:35.349008 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jan 19 09:24:35.349148 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff] Jan 19 09:24:35.349289 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff] Jan 19 09:24:35.349442 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref] Jan 19 09:24:35.349594 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Jan 19 09:24:35.349728 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Jan 19 09:24:35.349860 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Jan 19 09:24:35.349992 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Jan 19 09:24:35.350123 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Jan 19 09:24:35.350255 kernel: pci_bus 0000:00: resource 9 [mem 0xc000000000-0xc7ffffffff window] Jan 19 09:24:35.350397 kernel: pci_bus 0000:01: resource 1 [mem 0x81200000-0x812fffff] Jan 19 09:24:35.350555 kernel: pci_bus 0000:01: resource 2 [mem 0xc060000000-0xc0600fffff 64bit pref] Jan 19 09:24:35.350697 kernel: pci_bus 0000:02: resource 1 [mem 0x81100000-0x811fffff] Jan 19 09:24:35.350838 kernel: pci_bus 0000:03: resource 1 [mem 0x81000000-0x810fffff] Jan 19 09:24:35.350976 kernel: pci_bus 0000:03: resource 2 [mem 0xc060100000-0xc0601fffff 64bit pref] Jan 19 09:24:35.351117 kernel: pci_bus 0000:04: resource 2 [mem 0xc060200000-0xc0602fffff 64bit pref] Jan 19 09:24:35.351260 kernel: pci_bus 0000:05: resource 1 [mem 0x80f00000-0x80ffffff] Jan 19 09:24:35.351395 kernel: pci_bus 0000:05: resource 2 [mem 0xc060300000-0xc0603fffff 64bit pref] Jan 19 09:24:35.351557 kernel: pci_bus 0000:06: resource 1 [mem 0x80e00000-0x80efffff] Jan 19 09:24:35.351693 kernel: pci_bus 0000:06: resource 2 [mem 0xc060400000-0xc0604fffff 64bit pref] Jan 19 09:24:35.351836 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] Jan 19 09:24:35.351975 kernel: pci_bus 0000:07: resource 1 [mem 0x80c00000-0x80dfffff] Jan 19 09:24:35.352109 kernel: pci_bus 0000:07: resource 2 [mem 0xc000000000-0xc01fffffff 64bit pref] Jan 19 09:24:35.352249 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] Jan 19 09:24:35.352383 kernel: pci_bus 0000:08: resource 1 [mem 0x80a00000-0x80bfffff] Jan 19 09:24:35.352586 kernel: pci_bus 0000:08: resource 2 [mem 0xc020000000-0xc03fffffff 64bit pref] Jan 19 09:24:35.352732 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff] Jan 19 09:24:35.352869 kernel: pci_bus 0000:09: resource 1 [mem 0x80800000-0x809fffff] Jan 19 09:24:35.353005 kernel: pci_bus 0000:09: resource 2 [mem 0xc040000000-0xc05fffffff 64bit pref] Jan 19 09:24:35.353014 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jan 19 09:24:35.353021 kernel: PCI: CLS 0 bytes, default 64 Jan 19 09:24:35.353027 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jan 19 09:24:35.353034 kernel: software IO TLB: mapped [mem 0x0000000077ffd000-0x000000007bffd000] (64MB) Jan 19 09:24:35.353043 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns Jan 19 09:24:35.353049 kernel: Initialise system trusted keyrings Jan 19 09:24:35.353056 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 19 09:24:35.353062 kernel: Key type asymmetric registered Jan 19 09:24:35.353069 kernel: Asymmetric key parser 'x509' registered Jan 19 09:24:35.353075 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jan 19 09:24:35.353081 
kernel: io scheduler mq-deadline registered Jan 19 09:24:35.353090 kernel: io scheduler kyber registered Jan 19 09:24:35.353096 kernel: io scheduler bfq registered Jan 19 09:24:35.353239 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jan 19 09:24:35.353382 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jan 19 09:24:35.353543 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jan 19 09:24:35.353685 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jan 19 09:24:35.353844 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jan 19 09:24:35.354022 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jan 19 09:24:35.354193 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jan 19 09:24:35.354345 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Jan 19 09:24:35.354567 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jan 19 09:24:35.354740 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jan 19 09:24:35.354904 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jan 19 09:24:35.355063 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Jan 19 09:24:35.355214 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Jan 19 09:24:35.355355 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jan 19 09:24:35.355520 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jan 19 09:24:35.355665 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jan 19 09:24:35.355678 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jan 19 09:24:35.355851 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jan 19 09:24:35.356009 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jan 19 09:24:35.356018 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jan 19 09:24:35.356025 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jan 19 09:24:35.356032 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 19 09:24:35.356042 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 19 09:24:35.356057 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jan 19 09:24:35.356067 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 19 09:24:35.356077 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 19 09:24:35.356087 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 19 09:24:35.356268 kernel: rtc_cmos 00:03: RTC can wake from S4 Jan 19 09:24:35.356457 kernel: rtc_cmos 00:03: registered as rtc0 Jan 19 09:24:35.356638 kernel: rtc_cmos 00:03: setting system clock to 2026-01-19T09:24:33 UTC (1768814673) Jan 19 09:24:35.356813 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Jan 19 09:24:35.356831 kernel: amd_pstate: The CPPC feature is supported but currently disabled by the BIOS. Please enable it if your BIOS has the CPPC option. 
Jan 19 09:24:35.356841 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Jan 19 09:24:35.356851 kernel: efifb: probing for efifb Jan 19 09:24:35.356862 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Jan 19 09:24:35.356875 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Jan 19 09:24:35.356885 kernel: efifb: scrolling: redraw Jan 19 09:24:35.356891 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Jan 19 09:24:35.356900 kernel: Console: switching to colour frame buffer device 160x50 Jan 19 09:24:35.356911 kernel: fb0: EFI VGA frame buffer device Jan 19 09:24:35.356921 kernel: pstore: Using crash dump compression: deflate Jan 19 09:24:35.356931 kernel: pstore: Registered efi_pstore as persistent store backend Jan 19 09:24:35.356941 kernel: NET: Registered PF_INET6 protocol family Jan 19 09:24:35.356953 kernel: Segment Routing with IPv6 Jan 19 09:24:35.356964 kernel: In-situ OAM (IOAM) with IPv6 Jan 19 09:24:35.356975 kernel: NET: Registered PF_PACKET protocol family Jan 19 09:24:35.356985 kernel: Key type dns_resolver registered Jan 19 09:24:35.356995 kernel: IPI shorthand broadcast: enabled Jan 19 09:24:35.357003 kernel: sched_clock: Marking stable (2015023455, 237662133)->(2285081074, -32395486) Jan 19 09:24:35.357013 kernel: registered taskstats version 1 Jan 19 09:24:35.357026 kernel: Loading compiled-in X.509 certificates Jan 19 09:24:35.357037 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.65-flatcar: a9591db9912320a48a0589d0293fff3e535b90df' Jan 19 09:24:35.357047 kernel: Demotion targets for Node 0: null Jan 19 09:24:35.357057 kernel: Key type .fscrypt registered Jan 19 09:24:35.357066 kernel: Key type fscrypt-provisioning registered Jan 19 09:24:35.357073 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 19 09:24:35.357079 kernel: ima: Allocated hash algorithm: sha1 Jan 19 09:24:35.357088 kernel: ima: No architecture policies found Jan 19 09:24:35.357094 kernel: clk: Disabling unused clocks Jan 19 09:24:35.357100 kernel: Freeing unused kernel image (initmem) memory: 15536K Jan 19 09:24:35.357107 kernel: Write protecting the kernel read-only data: 47104k Jan 19 09:24:35.357113 kernel: Freeing unused kernel image (rodata/data gap) memory: 1124K Jan 19 09:24:35.357123 kernel: Run /init as init process Jan 19 09:24:35.357133 kernel: with arguments: Jan 19 09:24:35.357147 kernel: /init Jan 19 09:24:35.357157 kernel: with environment: Jan 19 09:24:35.357167 kernel: HOME=/ Jan 19 09:24:35.357174 kernel: TERM=linux Jan 19 09:24:35.357184 kernel: ACPI: bus type USB registered Jan 19 09:24:35.357195 kernel: usbcore: registered new interface driver usbfs Jan 19 09:24:35.357205 kernel: usbcore: registered new interface driver hub Jan 19 09:24:35.357218 kernel: usbcore: registered new device driver usb Jan 19 09:24:35.357590 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 19 09:24:35.357766 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 19 09:24:35.357935 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 19 09:24:35.358136 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 19 09:24:35.358322 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 19 09:24:35.358503 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 19 09:24:35.358747 kernel: hub 1-0:1.0: USB hub found Jan 19 09:24:35.358909 kernel: hub 1-0:1.0: 4 ports detected Jan 19 09:24:35.359075 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jan 19 09:24:35.359255 kernel: hub 2-0:1.0: USB hub found Jan 19 09:24:35.359414 kernel: hub 2-0:1.0: 4 ports detected Jan 19 09:24:35.359425 kernel: SCSI subsystem initialized Jan 19 09:24:35.359478 kernel: libata version 3.00 loaded. 
Jan 19 09:24:35.359631 kernel: ahci 0000:00:1f.2: version 3.0 Jan 19 09:24:35.359640 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jan 19 09:24:35.359782 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jan 19 09:24:35.359934 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jan 19 09:24:35.360106 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jan 19 09:24:35.360320 kernel: scsi host0: ahci Jan 19 09:24:35.360587 kernel: scsi host1: ahci Jan 19 09:24:35.360779 kernel: scsi host2: ahci Jan 19 09:24:35.360963 kernel: scsi host3: ahci Jan 19 09:24:35.361148 kernel: scsi host4: ahci Jan 19 09:24:35.361335 kernel: scsi host5: ahci Jan 19 09:24:35.361347 kernel: ata1: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380100 irq 38 lpm-pol 1 Jan 19 09:24:35.361357 kernel: ata2: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380180 irq 38 lpm-pol 1 Jan 19 09:24:35.361367 kernel: ata3: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380200 irq 38 lpm-pol 1 Jan 19 09:24:35.361377 kernel: ata4: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380280 irq 38 lpm-pol 1 Jan 19 09:24:35.361384 kernel: ata5: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380300 irq 38 lpm-pol 1 Jan 19 09:24:35.361393 kernel: ata6: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380380 irq 38 lpm-pol 1 Jan 19 09:24:35.361632 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 19 09:24:35.361644 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 19 09:24:35.361652 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 19 09:24:35.361658 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 19 09:24:35.361667 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 19 09:24:35.361682 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 19 09:24:35.361692 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Jan 19 09:24:35.361700 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 19 09:24:35.361707 kernel: ata1.00: LPM support broken, forcing max_power Jan 19 09:24:35.361714 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Jan 19 09:24:35.361720 kernel: ata1.00: applying bridge limits Jan 19 09:24:35.361728 kernel: ata1.00: LPM support broken, forcing max_power Jan 19 09:24:35.361737 kernel: ata1.00: configured for UDMA/100 Jan 19 09:24:35.361942 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Jan 19 09:24:35.361953 kernel: usbcore: registered new interface driver usbhid Jan 19 09:24:35.361960 kernel: usbhid: USB HID core driver Jan 19 09:24:35.362150 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Jan 19 09:24:35.362341 kernel: scsi host6: Virtio SCSI HBA Jan 19 09:24:35.362572 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Jan 19 09:24:35.362776 kernel: scsi 6:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Jan 19 09:24:35.362787 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 19 09:24:35.362974 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Jan 19 09:24:35.363167 kernel: sd 6:0:0:0: Power-on or device reset occurred Jan 19 09:24:35.363178 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Jan 19 09:24:35.363376 kernel: sd 6:0:0:0: [sda] 160006144 512-byte logical blocks: (81.9 GB/76.3 GiB) Jan 19 09:24:35.363646 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 19 
09:24:35.363844 kernel: sd 6:0:0:0: [sda] Write Protect is off Jan 19 09:24:35.364039 kernel: sd 6:0:0:0: [sda] Mode Sense: 63 00 00 08 Jan 19 09:24:35.364228 kernel: sd 6:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Jan 19 09:24:35.364237 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 19 09:24:35.364251 kernel: GPT:25804799 != 160006143 Jan 19 09:24:35.364261 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 19 09:24:35.364271 kernel: GPT:25804799 != 160006143 Jan 19 09:24:35.364280 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 19 09:24:35.364287 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 19 09:24:35.364563 kernel: sd 6:0:0:0: [sda] Attached SCSI disk Jan 19 09:24:35.364575 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 19 09:24:35.364586 kernel: device-mapper: uevent: version 1.0.3 Jan 19 09:24:35.364593 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 19 09:24:35.364603 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Jan 19 09:24:35.364614 kernel: raid6: avx512x4 gen() 38206 MB/s Jan 19 09:24:35.364624 kernel: raid6: avx512x2 gen() 38846 MB/s Jan 19 09:24:35.364633 kernel: raid6: avx512x1 gen() 40839 MB/s Jan 19 09:24:35.364640 kernel: raid6: avx2x4 gen() 45036 MB/s Jan 19 09:24:35.364649 kernel: raid6: avx2x2 gen() 49012 MB/s Jan 19 09:24:35.364656 kernel: raid6: avx2x1 gen() 40261 MB/s Jan 19 09:24:35.364663 kernel: raid6: using algorithm avx2x2 gen() 49012 MB/s Jan 19 09:24:35.364669 kernel: raid6: .... xor() 34675 MB/s, rmw enabled Jan 19 09:24:35.364676 kernel: raid6: using avx512x2 recovery algorithm Jan 19 09:24:35.364686 kernel: xor: automatically using best checksumming function avx Jan 19 09:24:35.364696 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 19 09:24:35.364709 kernel: BTRFS: device fsid a5f82c06-1ff1-43b3-a650-214802f1359b devid 1 transid 35 /dev/mapper/usr (254:0) scanned by mount (184) Jan 19 09:24:35.364716 kernel: BTRFS info (device dm-0): first mount of filesystem a5f82c06-1ff1-43b3-a650-214802f1359b Jan 19 09:24:35.364723 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 19 09:24:35.364729 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 19 09:24:35.364736 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 19 09:24:35.364742 kernel: BTRFS info (device dm-0): enabling free space tree Jan 19 09:24:35.364749 kernel: loop: module loaded Jan 19 09:24:35.364757 kernel: loop0: detected capacity change from 0 to 100536 Jan 19 09:24:35.364764 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 19 09:24:35.364772 systemd[1]: Successfully made /usr/ read-only. Jan 19 09:24:35.364781 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 19 09:24:35.364788 systemd[1]: Detected virtualization kvm. Jan 19 09:24:35.364795 systemd[1]: Detected architecture x86-64. Jan 19 09:24:35.364804 systemd[1]: Running in initrd. Jan 19 09:24:35.364811 systemd[1]: No hostname configured, using default hostname. 
Jan 19 09:24:35.364818 systemd[1]: Hostname set to . Jan 19 09:24:35.364825 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 19 09:24:35.364835 systemd[1]: Queued start job for default target initrd.target. Jan 19 09:24:35.364861 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 19 09:24:35.364871 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 19 09:24:35.364878 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 19 09:24:35.364886 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 19 09:24:35.364894 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 19 09:24:35.364906 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 19 09:24:35.364917 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 19 09:24:35.364930 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 19 09:24:35.364939 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 19 09:24:35.364946 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 19 09:24:35.364956 systemd[1]: Reached target paths.target - Path Units. Jan 19 09:24:35.364968 systemd[1]: Reached target slices.target - Slice Units. Jan 19 09:24:35.364979 systemd[1]: Reached target swap.target - Swaps. Jan 19 09:24:35.364989 systemd[1]: Reached target timers.target - Timer Units. Jan 19 09:24:35.364998 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 19 09:24:35.365008 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 19 09:24:35.365020 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 19 09:24:35.365031 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 19 09:24:35.365041 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 19 09:24:35.365048 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 19 09:24:35.365055 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 19 09:24:35.365069 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 19 09:24:35.365079 systemd[1]: Reached target sockets.target - Socket Units. Jan 19 09:24:35.365110 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 19 09:24:35.365121 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 19 09:24:35.365132 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 19 09:24:35.365142 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 19 09:24:35.365154 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 19 09:24:35.365161 systemd[1]: Starting systemd-fsck-usr.service... Jan 19 09:24:35.365172 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 19 09:24:35.365183 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Jan 19 09:24:35.365194 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 19 09:24:35.365205 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 19 09:24:35.365212 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 19 09:24:35.365222 systemd[1]: Finished systemd-fsck-usr.service. Jan 19 09:24:35.365233 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 19 09:24:35.365291 systemd-journald[321]: Collecting audit messages is enabled. Jan 19 09:24:35.365318 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 19 09:24:35.365330 kernel: Bridge firewalling registered Jan 19 09:24:35.365341 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 19 09:24:35.365352 kernel: audit: type=1130 audit(1768814675.345:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.365364 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 19 09:24:35.365371 systemd-journald[321]: Journal started Jan 19 09:24:35.365392 systemd-journald[321]: Runtime Journal (/run/log/journal/9ca4cedd972b488fa27399359e1155d9) is 8M, max 76M, 68M free. Jan 19 09:24:35.345000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.342107 systemd-modules-load[322]: Inserted module 'br_netfilter' Jan 19 09:24:35.370487 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 19 09:24:35.370000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.377837 kernel: audit: type=1130 audit(1768814675.370:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.377863 systemd[1]: Started systemd-journald.service - Journal Service. Jan 19 09:24:35.378000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.385925 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 19 09:24:35.388372 kernel: audit: type=1130 audit(1768814675.378:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.388391 kernel: audit: type=1130 audit(1768814675.386:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.386000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 19 09:24:35.392131 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 19 09:24:35.392000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.395552 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 19 09:24:35.398860 kernel: audit: type=1130 audit(1768814675.392:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.398000 audit: BPF prog-id=6 op=LOAD Jan 19 09:24:35.399938 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 19 09:24:35.401734 kernel: audit: type=1334 audit(1768814675.398:7): prog-id=6 op=LOAD Jan 19 09:24:35.412558 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 19 09:24:35.416102 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 19 09:24:35.427109 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 19 09:24:35.428000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.435504 kernel: audit: type=1130 audit(1768814675.428:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.437000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.437566 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 19 09:24:35.443829 kernel: audit: type=1130 audit(1768814675.437:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.445555 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 19 09:24:35.450049 systemd-tmpfiles[347]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 19 09:24:35.457802 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 19 09:24:35.458000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.463784 dracut-cmdline[360]: dracut-109 Jan 19 09:24:35.465137 kernel: audit: type=1130 audit(1768814675.458:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.465951 systemd-resolved[342]: Positive Trust Anchors: Jan 19 09:24:35.466563 systemd-resolved[342]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 19 09:24:35.466568 systemd-resolved[342]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 19 09:24:35.469032 dracut-cmdline[360]: Using kernel command line parameters: SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=e880b5400e832e1de59b993d9ba6b86a9089175f10b4985da8b7b47cc8c74099 Jan 19 09:24:35.466589 systemd-resolved[342]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 19 09:24:35.487909 systemd-resolved[342]: Defaulting to hostname 'linux'. Jan 19 09:24:35.488810 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 19 09:24:35.489000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.490177 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 19 09:24:35.545467 kernel: Loading iSCSI transport class v2.0-870. Jan 19 09:24:35.557558 kernel: iscsi: registered transport (tcp) Jan 19 09:24:35.576161 kernel: iscsi: registered transport (qla4xxx) Jan 19 09:24:35.576233 kernel: QLogic iSCSI HBA Driver Jan 19 09:24:35.605242 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 19 09:24:35.634974 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 19 09:24:35.635000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.638366 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 19 09:24:35.691687 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 19 09:24:35.691000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.693602 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 19 09:24:35.695701 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 19 09:24:35.741066 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 19 09:24:35.743000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.745000 audit: BPF prog-id=7 op=LOAD Jan 19 09:24:35.745000 audit: BPF prog-id=8 op=LOAD Jan 19 09:24:35.747723 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
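dracut echoes the full kernel command line above (root=LABEL=ROOT, mount.usr=/dev/mapper/usr, verity.usrhash=..., flatcar.oem.id=hetzner, and so on), and both dracut and Ignition consume it as key=value parameters. A minimal sketch of splitting such a command line up, assuming whitespace-separated tokens as found in /proc/cmdline (duplicate keys like console simply keep the last value here):

import shlex

def parse_cmdline(cmdline: str) -> dict:
    # Split on whitespace and break each "key=value" token apart;
    # bare flags (e.g. "quiet") map to an empty string.
    params = {}
    for token in shlex.split(cmdline):
        key, _, value = token.partition("=")
        params[key] = value
    return params

if __name__ == "__main__":
    with open("/proc/cmdline") as f:
        params = parse_cmdline(f.read())
    print(params.get("root"))            # e.g. LABEL=ROOT
    print(params.get("flatcar.oem.id"))  # e.g. hetzner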
Jan 19 09:24:35.783289 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 19 09:24:35.783000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.795342 systemd-udevd[628]: Using default interface naming scheme 'v257'. Jan 19 09:24:35.805535 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 19 09:24:35.809000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.812329 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 19 09:24:35.815000 audit: BPF prog-id=9 op=LOAD Jan 19 09:24:35.823585 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 19 09:24:35.837074 dracut-pre-trigger[694]: rd.md=0: removing MD RAID activation Jan 19 09:24:35.867421 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 19 09:24:35.867000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.869886 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 19 09:24:35.886235 systemd-networkd[695]: lo: Link UP Jan 19 09:24:35.886875 systemd-networkd[695]: lo: Gained carrier Jan 19 09:24:35.887426 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 19 09:24:35.888000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.888927 systemd[1]: Reached target network.target - Network. Jan 19 09:24:35.986407 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 19 09:24:35.989000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:35.995821 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 19 09:24:36.119337 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Jan 19 09:24:36.133428 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Jan 19 09:24:36.150547 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Jan 19 09:24:36.158795 kernel: cryptd: max_cpu_qlen set to 1000 Jan 19 09:24:36.163206 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jan 19 09:24:36.171832 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 19 09:24:36.191077 disk-uuid[774]: Primary Header is updated. Jan 19 09:24:36.191077 disk-uuid[774]: Secondary Entries is updated. Jan 19 09:24:36.191077 disk-uuid[774]: Secondary Header is updated. Jan 19 09:24:36.207549 kernel: AES CTR mode by8 optimization enabled Jan 19 09:24:36.207794 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 19 09:24:36.208511 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Jan 19 09:24:36.211000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:36.211704 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 19 09:24:36.223624 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Jan 19 09:24:36.223930 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 19 09:24:36.228269 systemd-networkd[695]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 19 09:24:36.228276 systemd-networkd[695]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 19 09:24:36.230929 systemd-networkd[695]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 19 09:24:36.230933 systemd-networkd[695]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 19 09:24:36.231664 systemd-networkd[695]: eth0: Link UP Jan 19 09:24:36.231800 systemd-networkd[695]: eth1: Link UP Jan 19 09:24:36.231948 systemd-networkd[695]: eth0: Gained carrier Jan 19 09:24:36.231956 systemd-networkd[695]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 19 09:24:36.237757 systemd-networkd[695]: eth1: Gained carrier Jan 19 09:24:36.237767 systemd-networkd[695]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 19 09:24:36.253619 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 19 09:24:36.253710 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 19 09:24:36.254000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:36.254000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:36.260604 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 19 09:24:36.265309 systemd-networkd[695]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Jan 19 09:24:36.298498 systemd-networkd[695]: eth0: DHCPv4 address 77.42.19.180/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jan 19 09:24:36.301551 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 19 09:24:36.301000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:36.358821 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 19 09:24:36.358000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:36.359841 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 19 09:24:36.360256 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. 
Jan 19 09:24:36.361041 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 19 09:24:36.363493 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 19 09:24:36.381385 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 19 09:24:36.381000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.249927 disk-uuid[775]: Warning: The kernel is still using the old partition table. Jan 19 09:24:37.249927 disk-uuid[775]: The new table will be used at the next reboot or after you Jan 19 09:24:37.249927 disk-uuid[775]: run partprobe(8) or kpartx(8) Jan 19 09:24:37.249927 disk-uuid[775]: The operation has completed successfully. Jan 19 09:24:37.261515 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 19 09:24:37.288702 kernel: kauditd_printk_skb: 18 callbacks suppressed Jan 19 09:24:37.288747 kernel: audit: type=1130 audit(1768814677.262:29): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.288774 kernel: audit: type=1131 audit(1768814677.262:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.262000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.262000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.261753 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 19 09:24:37.264717 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 19 09:24:37.349512 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (898) Jan 19 09:24:37.349593 kernel: BTRFS info (device sda6): first mount of filesystem 984b7cbf-e15c-4ac8-8ab0-1fb2c55516eb Jan 19 09:24:37.355514 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 19 09:24:37.366348 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 19 09:24:37.366431 kernel: BTRFS info (device sda6): turning on async discard Jan 19 09:24:37.366494 kernel: BTRFS info (device sda6): enabling free space tree Jan 19 09:24:37.383538 kernel: BTRFS info (device sda6): last unmount of filesystem 984b7cbf-e15c-4ac8-8ab0-1fb2c55516eb Jan 19 09:24:37.386209 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 19 09:24:37.401564 kernel: audit: type=1130 audit(1768814677.387:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.387000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.390291 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
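The disk-uuid warning above notes that the kernel keeps using the old partition table until partprobe(8) or kpartx(8) is run (or the machine reboots). One way to issue that re-read request directly is the BLKRRPART ioctl, which is what blockdev --rereadpt uses and roughly what partprobe achieves. A minimal sketch, assuming /dev/sda as in this log and the BLKRRPART value 0x125F from <linux/fs.h>; it needs root and fails with EBUSY while any partition on the disk is in use:

import fcntl
import os

BLKRRPART = 0x125F  # _IO(0x12, 95) from <linux/fs.h>: re-read partition table

def reread_partition_table(device: str = "/dev/sda") -> None:
    # Open the whole-disk device and ask the kernel to re-scan its partitions.
    fd = os.open(device, os.O_RDONLY)
    try:
        fcntl.ioctl(fd, BLKRRPART)
    finally:
        os.close(fd)

if __name__ == "__main__":
    reread_partition_table()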
Jan 19 09:24:37.510944 systemd-networkd[695]: eth1: Gained IPv6LL Jan 19 09:24:37.560857 ignition[917]: Ignition 2.24.0 Jan 19 09:24:37.560868 ignition[917]: Stage: fetch-offline Jan 19 09:24:37.561065 ignition[917]: no configs at "/usr/lib/ignition/base.d" Jan 19 09:24:37.561076 ignition[917]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 19 09:24:37.561153 ignition[917]: parsed url from cmdline: "" Jan 19 09:24:37.561156 ignition[917]: no config URL provided Jan 19 09:24:37.561160 ignition[917]: reading system config file "/usr/lib/ignition/user.ign" Jan 19 09:24:37.561169 ignition[917]: no config at "/usr/lib/ignition/user.ign" Jan 19 09:24:37.561174 ignition[917]: failed to fetch config: resource requires networking Jan 19 09:24:37.563560 ignition[917]: Ignition finished successfully Jan 19 09:24:37.566054 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 19 09:24:37.571922 kernel: audit: type=1130 audit(1768814677.566:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.566000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.568581 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 19 09:24:37.590792 ignition[923]: Ignition 2.24.0 Jan 19 09:24:37.591386 ignition[923]: Stage: fetch Jan 19 09:24:37.591568 ignition[923]: no configs at "/usr/lib/ignition/base.d" Jan 19 09:24:37.591577 ignition[923]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 19 09:24:37.591652 ignition[923]: parsed url from cmdline: "" Jan 19 09:24:37.591657 ignition[923]: no config URL provided Jan 19 09:24:37.591661 ignition[923]: reading system config file "/usr/lib/ignition/user.ign" Jan 19 09:24:37.591668 ignition[923]: no config at "/usr/lib/ignition/user.ign" Jan 19 09:24:37.591689 ignition[923]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Jan 19 09:24:37.596879 ignition[923]: GET result: OK Jan 19 09:24:37.597283 ignition[923]: parsing config with SHA512: f6b1db6a76ba982c6f81fc6b19d4c106088fb0733873b7d9f5175dcf6aa70423f1612d582ddf11db621cd107efd450a4b13ef4b4fc7d28a46a6cfdb027c7388d Jan 19 09:24:37.604751 unknown[923]: fetched base config from "system" Jan 19 09:24:37.604776 unknown[923]: fetched base config from "system" Jan 19 09:24:37.604789 unknown[923]: fetched user config from "hetzner" Jan 19 09:24:37.605194 ignition[923]: fetch: fetch complete Jan 19 09:24:37.605206 ignition[923]: fetch: fetch passed Jan 19 09:24:37.605304 ignition[923]: Ignition finished successfully Jan 19 09:24:37.610324 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 19 09:24:37.610000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.613569 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 19 09:24:37.624527 kernel: audit: type=1130 audit(1768814677.610:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 19 09:24:37.636620 systemd-networkd[695]: eth0: Gained IPv6LL Jan 19 09:24:37.672139 ignition[929]: Ignition 2.24.0 Jan 19 09:24:37.672158 ignition[929]: Stage: kargs Jan 19 09:24:37.672371 ignition[929]: no configs at "/usr/lib/ignition/base.d" Jan 19 09:24:37.672387 ignition[929]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 19 09:24:37.676217 ignition[929]: kargs: kargs passed Jan 19 09:24:37.676279 ignition[929]: Ignition finished successfully Jan 19 09:24:37.679690 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 19 09:24:37.686385 kernel: audit: type=1130 audit(1768814677.679:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.679000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.681972 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 19 09:24:37.717756 ignition[935]: Ignition 2.24.0 Jan 19 09:24:37.717768 ignition[935]: Stage: disks Jan 19 09:24:37.717893 ignition[935]: no configs at "/usr/lib/ignition/base.d" Jan 19 09:24:37.717901 ignition[935]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 19 09:24:37.722006 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 19 09:24:37.734091 kernel: audit: type=1130 audit(1768814677.722:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.722000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.718602 ignition[935]: disks: disks passed Jan 19 09:24:37.723276 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 19 09:24:37.718643 ignition[935]: Ignition finished successfully Jan 19 09:24:37.735018 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 19 09:24:37.736336 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 19 09:24:37.737610 systemd[1]: Reached target sysinit.target - System Initialization. Jan 19 09:24:37.739039 systemd[1]: Reached target basic.target - Basic System. Jan 19 09:24:37.742691 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 19 09:24:37.797174 systemd-fsck[943]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 19 09:24:37.801702 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 19 09:24:37.803000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.805717 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 19 09:24:37.820117 kernel: audit: type=1130 audit(1768814677.803:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:37.929485 kernel: EXT4-fs (sda9): mounted filesystem ec5ae8d3-548b-4a34-bd68-b1a953fcffb6 r/w with ordered data mode. 
Quota mode: none. Jan 19 09:24:37.930085 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 19 09:24:37.931062 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 19 09:24:37.933156 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 19 09:24:37.935492 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 19 09:24:37.937617 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Jan 19 09:24:37.938932 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 19 09:24:37.938961 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 19 09:24:37.947116 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 19 09:24:37.952610 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 19 09:24:37.963464 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (951) Jan 19 09:24:37.969079 kernel: BTRFS info (device sda6): first mount of filesystem 984b7cbf-e15c-4ac8-8ab0-1fb2c55516eb Jan 19 09:24:37.969109 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 19 09:24:37.981901 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 19 09:24:37.981951 kernel: BTRFS info (device sda6): turning on async discard Jan 19 09:24:37.981961 kernel: BTRFS info (device sda6): enabling free space tree Jan 19 09:24:37.987972 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 19 09:24:38.026453 coreos-metadata[953]: Jan 19 09:24:38.026 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Jan 19 09:24:38.028287 coreos-metadata[953]: Jan 19 09:24:38.027 INFO Fetch successful Jan 19 09:24:38.029509 coreos-metadata[953]: Jan 19 09:24:38.028 INFO wrote hostname ci-4580-0-0-p-637e605ed7 to /sysroot/etc/hostname Jan 19 09:24:38.032104 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jan 19 09:24:38.032000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:38.039533 kernel: audit: type=1130 audit(1768814678.032:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:38.137113 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 19 09:24:38.137000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:38.138714 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 19 09:24:38.145515 kernel: audit: type=1130 audit(1768814678.137:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:38.149591 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... 
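Both Ignition (the earlier GET of /hetzner/v1/userdata) and the flatcar-metadata-hostname service above pull their data from the link-local metadata service at 169.254.169.254, so these requests only work from inside the VM. A minimal sketch of the same hostname fetch that coreos-metadata logs above:

from urllib.request import urlopen

METADATA_HOSTNAME_URL = "http://169.254.169.254/hetzner/v1/metadata/hostname"

def fetch_hostname(timeout: float = 5.0) -> str:
    # Same endpoint coreos-metadata reports fetching in the log above.
    with urlopen(METADATA_HOSTNAME_URL, timeout=timeout) as resp:
        return resp.read().decode().strip()

if __name__ == "__main__":
    # coreos-metadata then writes the result to /sysroot/etc/hostname.
    print(fetch_hostname())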
Jan 19 09:24:38.161467 kernel: BTRFS info (device sda6): last unmount of filesystem 984b7cbf-e15c-4ac8-8ab0-1fb2c55516eb Jan 19 09:24:38.183470 ignition[1053]: INFO : Ignition 2.24.0 Jan 19 09:24:38.183470 ignition[1053]: INFO : Stage: mount Jan 19 09:24:38.186150 ignition[1053]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 19 09:24:38.186150 ignition[1053]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 19 09:24:38.186150 ignition[1053]: INFO : mount: mount passed Jan 19 09:24:38.186150 ignition[1053]: INFO : Ignition finished successfully Jan 19 09:24:38.185000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:38.185815 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 19 09:24:38.187606 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 19 09:24:38.191000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:38.189738 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 19 09:24:38.323563 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 19 09:24:38.325540 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 19 09:24:38.353986 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1064) Jan 19 09:24:38.354046 kernel: BTRFS info (device sda6): first mount of filesystem 984b7cbf-e15c-4ac8-8ab0-1fb2c55516eb Jan 19 09:24:38.359926 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 19 09:24:38.370608 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 19 09:24:38.370680 kernel: BTRFS info (device sda6): turning on async discard Jan 19 09:24:38.372551 kernel: BTRFS info (device sda6): enabling free space tree Jan 19 09:24:38.378334 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jan 19 09:24:38.416708 ignition[1080]: INFO : Ignition 2.24.0 Jan 19 09:24:38.416708 ignition[1080]: INFO : Stage: files Jan 19 09:24:38.418094 ignition[1080]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 19 09:24:38.418094 ignition[1080]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 19 09:24:38.419227 ignition[1080]: DEBUG : files: compiled without relabeling support, skipping Jan 19 09:24:38.420529 ignition[1080]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 19 09:24:38.420529 ignition[1080]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 19 09:24:38.425642 ignition[1080]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 19 09:24:38.426145 ignition[1080]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 19 09:24:38.426904 unknown[1080]: wrote ssh authorized keys file for user: core Jan 19 09:24:38.427480 ignition[1080]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 19 09:24:38.430228 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jan 19 09:24:38.430940 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Jan 19 09:24:38.577930 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 19 09:24:38.883649 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jan 19 09:24:38.883649 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 19 09:24:38.887647 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 19 09:24:38.887647 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 19 09:24:38.887647 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 19 09:24:38.887647 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 19 09:24:38.887647 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 19 09:24:38.887647 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 19 09:24:38.887647 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 19 09:24:38.887647 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 19 09:24:38.894380 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 19 09:24:38.894380 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Jan 19 09:24:38.894380 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: 
op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Jan 19 09:24:38.894380 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Jan 19 09:24:38.894380 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-x86-64.raw: attempt #1 Jan 19 09:24:39.469677 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 19 09:24:39.889629 ignition[1080]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Jan 19 09:24:39.889629 ignition[1080]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 19 09:24:39.892414 ignition[1080]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 19 09:24:39.895245 ignition[1080]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 19 09:24:39.895245 ignition[1080]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 19 09:24:39.895245 ignition[1080]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Jan 19 09:24:39.898767 ignition[1080]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jan 19 09:24:39.898767 ignition[1080]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jan 19 09:24:39.898767 ignition[1080]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Jan 19 09:24:39.898767 ignition[1080]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Jan 19 09:24:39.898767 ignition[1080]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Jan 19 09:24:39.898767 ignition[1080]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 19 09:24:39.898767 ignition[1080]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 19 09:24:39.898767 ignition[1080]: INFO : files: files passed Jan 19 09:24:39.898767 ignition[1080]: INFO : Ignition finished successfully Jan 19 09:24:39.901000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:39.900293 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 19 09:24:39.905706 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 19 09:24:39.915739 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 19 09:24:39.926014 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 19 09:24:39.931682 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
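The files stage above writes SSH keys for "core", several files under /home/core and /opt, the kubernetes sysext link, and the prepare-helm.service unit, all driven by the user config fetched from Hetzner earlier. For orientation only, a sketch of the general shape of an Ignition v3 config that would produce that kind of result; the spec version, key material, and unit body below are illustrative placeholders, not the config this machine actually received:

import json

# Hypothetical skeleton in the style of an Ignition v3 config.
config = {
    "ignition": {"version": "3.4.0"},
    "passwd": {
        "users": [
            {"name": "core", "sshAuthorizedKeys": ["ssh-ed25519 AAAA... user@host"]}
        ]
    },
    "storage": {
        "files": [
            {
                "path": "/etc/flatcar/update.conf",
                "mode": 420,  # octal 0644
                "contents": {"source": "data:,REBOOT_STRATEGY=off%0A"},
            }
        ],
        "links": [
            {
                "path": "/etc/extensions/kubernetes.raw",
                "target": "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw",
            }
        ],
    },
    "systemd": {
        "units": [
            {
                "name": "prepare-helm.service",
                "enabled": True,
                "contents": "[Unit]\nDescription=Unpack helm\n[Service]\nType=oneshot\nExecStart=/usr/bin/true\n[Install]\nWantedBy=multi-user.target\n",
            }
        ]
    },
}

print(json.dumps(config, indent=2))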
Jan 19 09:24:39.933000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:39.933000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:39.955310 initrd-setup-root-after-ignition[1112]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 19 09:24:39.955310 initrd-setup-root-after-ignition[1112]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 19 09:24:39.958560 initrd-setup-root-after-ignition[1116]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 19 09:24:39.960806 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 19 09:24:39.961000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:39.963219 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 19 09:24:39.965944 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 19 09:24:40.043806 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 19 09:24:40.044102 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 19 09:24:40.046000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.046000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.047585 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 19 09:24:40.049490 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 19 09:24:40.053188 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 19 09:24:40.055584 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 19 09:24:40.109927 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 19 09:24:40.110000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.115995 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 19 09:24:40.153656 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 19 09:24:40.155117 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 19 09:24:40.156963 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 19 09:24:40.159016 systemd[1]: Stopped target timers.target - Timer Units. Jan 19 09:24:40.161148 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 19 09:24:40.162000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 19 09:24:40.161402 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 19 09:24:40.163988 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 19 09:24:40.166012 systemd[1]: Stopped target basic.target - Basic System. Jan 19 09:24:40.168239 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 19 09:24:40.170190 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 19 09:24:40.172032 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 19 09:24:40.174095 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 19 09:24:40.176109 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 19 09:24:40.178245 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 19 09:24:40.180642 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 19 09:24:40.182978 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 19 09:24:40.184902 systemd[1]: Stopped target swap.target - Swaps. Jan 19 09:24:40.186778 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 19 09:24:40.189000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.186979 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 19 09:24:40.190985 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 19 09:24:40.193263 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 19 09:24:40.195135 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 19 09:24:40.195359 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 19 09:24:40.199000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.197252 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 19 09:24:40.197548 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 19 09:24:40.202000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.200633 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 19 09:24:40.204000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.201002 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 19 09:24:40.206000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.203062 systemd[1]: ignition-files.service: Deactivated successfully. Jan 19 09:24:40.203295 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 19 09:24:40.205243 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. 
Jan 19 09:24:40.205543 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jan 19 09:24:40.210820 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 19 09:24:40.212089 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 19 09:24:40.216000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.214724 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 19 09:24:40.221288 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 19 09:24:40.224832 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 19 09:24:40.229000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.225243 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 19 09:24:40.229903 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 19 09:24:40.234000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.230248 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 19 09:24:40.234926 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 19 09:24:40.243000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.235191 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 19 09:24:40.263037 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 19 09:24:40.264319 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 19 09:24:40.267000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.267000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.273694 ignition[1136]: INFO : Ignition 2.24.0 Jan 19 09:24:40.276385 ignition[1136]: INFO : Stage: umount Jan 19 09:24:40.276385 ignition[1136]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 19 09:24:40.276385 ignition[1136]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 19 09:24:40.276385 ignition[1136]: INFO : umount: umount passed Jan 19 09:24:40.276385 ignition[1136]: INFO : Ignition finished successfully Jan 19 09:24:40.282806 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 19 09:24:40.283713 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 19 09:24:40.285000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.286141 systemd[1]: ignition-disks.service: Deactivated successfully. 
Jan 19 09:24:40.286220 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 19 09:24:40.287000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.288536 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 19 09:24:40.288612 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 19 09:24:40.290000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.290621 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 19 09:24:40.291000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.290704 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 19 09:24:40.291683 systemd[1]: Stopped target network.target - Network. Jan 19 09:24:40.293000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.293088 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 19 09:24:40.293174 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 19 09:24:40.294396 systemd[1]: Stopped target paths.target - Path Units. Jan 19 09:24:40.296380 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 19 09:24:40.296548 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 19 09:24:40.297420 systemd[1]: Stopped target slices.target - Slice Units. Jan 19 09:24:40.298532 systemd[1]: Stopped target sockets.target - Socket Units. Jan 19 09:24:40.299652 systemd[1]: iscsid.socket: Deactivated successfully. Jan 19 09:24:40.304000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.299742 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 19 09:24:40.305000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.300963 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 19 09:24:40.301044 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 19 09:24:40.302228 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 19 09:24:40.302276 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 19 09:24:40.303415 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 19 09:24:40.311000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.303568 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 19 09:24:40.304628 systemd[1]: ignition-setup-pre.service: Deactivated successfully. 
Jan 19 09:24:40.313000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.304720 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 19 09:24:40.305719 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 19 09:24:40.306776 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 19 09:24:40.309676 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 19 09:24:40.310825 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 19 09:24:40.311033 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 19 09:24:40.312417 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 19 09:24:40.312686 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 19 09:24:40.317000 audit: BPF prog-id=6 op=UNLOAD Jan 19 09:24:40.316930 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 19 09:24:40.317000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.317063 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 19 09:24:40.320853 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 19 09:24:40.321065 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 19 09:24:40.321000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.325000 audit: BPF prog-id=9 op=UNLOAD Jan 19 09:24:40.325497 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 19 09:24:40.326334 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 19 09:24:40.326404 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 19 09:24:40.328783 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 19 09:24:40.329000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.329590 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 19 09:24:40.330000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.329677 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 19 09:24:40.331000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.330526 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 19 09:24:40.330621 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 19 09:24:40.331542 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 19 09:24:40.331614 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 19 09:24:40.332608 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jan 19 09:24:40.349629 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 19 09:24:40.351696 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 19 09:24:40.352000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.353684 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 19 09:24:40.353775 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 19 09:24:40.354886 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 19 09:24:40.354944 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 19 09:24:40.356000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.355882 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 19 09:24:40.355965 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 19 09:24:40.357000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.357215 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 19 09:24:40.357286 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 19 09:24:40.359000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.358642 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 19 09:24:40.358717 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 19 09:24:40.361254 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 19 09:24:40.362000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.362008 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 19 09:24:40.363000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.362093 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 19 09:24:40.364000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.362898 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 19 09:24:40.362972 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 19 09:24:40.363988 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 19 09:24:40.364063 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 19 09:24:40.366030 systemd[1]: network-cleanup.service: Deactivated successfully. 
Jan 19 09:24:40.366210 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 19 09:24:40.368000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.385266 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 19 09:24:40.385560 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 19 09:24:40.386000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.386000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:40.387232 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 19 09:24:40.390060 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 19 09:24:40.416381 systemd[1]: Switching root. Jan 19 09:24:40.460205 systemd-journald[321]: Journal stopped Jan 19 09:24:41.727086 systemd-journald[321]: Received SIGTERM from PID 1 (systemd). Jan 19 09:24:41.727169 kernel: SELinux: policy capability network_peer_controls=1 Jan 19 09:24:41.727186 kernel: SELinux: policy capability open_perms=1 Jan 19 09:24:41.727200 kernel: SELinux: policy capability extended_socket_class=1 Jan 19 09:24:41.727215 kernel: SELinux: policy capability always_check_network=0 Jan 19 09:24:41.727228 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 19 09:24:41.727245 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 19 09:24:41.727263 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 19 09:24:41.727281 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 19 09:24:41.727293 kernel: SELinux: policy capability userspace_initial_context=0 Jan 19 09:24:41.727307 systemd[1]: Successfully loaded SELinux policy in 80.268ms. Jan 19 09:24:41.727336 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.836ms. Jan 19 09:24:41.727353 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 19 09:24:41.727368 systemd[1]: Detected virtualization kvm. Jan 19 09:24:41.727383 systemd[1]: Detected architecture x86-64. Jan 19 09:24:41.727397 systemd[1]: Detected first boot. Jan 19 09:24:41.727415 systemd[1]: Hostname set to <ci-4580-0-0-p-637e605ed7>. Jan 19 09:24:41.727429 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 19 09:24:41.727456 zram_generator::config[1179]: No configuration found. Jan 19 09:24:41.727488 kernel: Guest personality initialized and is inactive Jan 19 09:24:41.727502 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Jan 19 09:24:41.727516 kernel: Initialized host personality Jan 19 09:24:41.727530 kernel: NET: Registered PF_VSOCK protocol family Jan 19 09:24:41.727544 systemd[1]: Populated /etc with preset unit settings. Jan 19 09:24:41.727568 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 19 09:24:41.727594 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jan 19 09:24:41.727612 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 19 09:24:41.727635 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 19 09:24:41.727650 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 19 09:24:41.727668 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 19 09:24:41.727683 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 19 09:24:41.727701 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 19 09:24:41.727716 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 19 09:24:41.727735 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 19 09:24:41.727749 systemd[1]: Created slice user.slice - User and Session Slice. Jan 19 09:24:41.727763 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 19 09:24:41.727779 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 19 09:24:41.727794 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 19 09:24:41.727812 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 19 09:24:41.727827 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 19 09:24:41.727844 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 19 09:24:41.727861 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 19 09:24:41.727875 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 19 09:24:41.727890 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 19 09:24:41.727907 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 19 09:24:41.727922 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 19 09:24:41.727937 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 19 09:24:41.727951 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 19 09:24:41.727966 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 19 09:24:41.727980 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 19 09:24:41.727995 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 19 09:24:41.728012 systemd[1]: Reached target slices.target - Slice Units. Jan 19 09:24:41.728027 systemd[1]: Reached target swap.target - Swaps. Jan 19 09:24:41.728041 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 19 09:24:41.728056 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 19 09:24:41.728070 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 19 09:24:41.728084 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 19 09:24:41.728098 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 19 09:24:41.728115 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 19 09:24:41.728129 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. 
Jan 19 09:24:41.728144 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 19 09:24:41.728159 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 19 09:24:41.728173 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 19 09:24:41.728188 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 19 09:24:41.728202 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 19 09:24:41.728219 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 19 09:24:41.728233 systemd[1]: Mounting media.mount - External Media Directory... Jan 19 09:24:41.728248 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 19 09:24:41.728268 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 19 09:24:41.728283 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 19 09:24:41.728296 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 19 09:24:41.728311 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 19 09:24:41.728328 systemd[1]: Reached target machines.target - Containers. Jan 19 09:24:41.728342 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 19 09:24:41.728356 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 19 09:24:41.728370 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 19 09:24:41.728385 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 19 09:24:41.728399 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 19 09:24:41.728413 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 19 09:24:41.728547 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 19 09:24:41.728564 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 19 09:24:41.728578 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 19 09:24:41.728594 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 19 09:24:41.728609 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 19 09:24:41.728624 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 19 09:24:41.728641 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 19 09:24:41.728655 systemd[1]: Stopped systemd-fsck-usr.service. Jan 19 09:24:41.728670 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 19 09:24:41.728684 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 19 09:24:41.728702 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 19 09:24:41.728716 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... 
Jan 19 09:24:41.728732 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 19 09:24:41.728746 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 19 09:24:41.728761 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 19 09:24:41.728776 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 19 09:24:41.728790 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 19 09:24:41.728807 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 19 09:24:41.728824 systemd[1]: Mounted media.mount - External Media Directory. Jan 19 09:24:41.728838 kernel: ACPI: bus type drm_connector registered Jan 19 09:24:41.728854 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 19 09:24:41.728871 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 19 09:24:41.728885 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 19 09:24:41.728899 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 19 09:24:41.728914 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 19 09:24:41.728929 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 19 09:24:41.728943 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 19 09:24:41.728957 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 19 09:24:41.728973 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 19 09:24:41.728987 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 19 09:24:41.729002 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 19 09:24:41.729016 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 19 09:24:41.729030 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 19 09:24:41.729045 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 19 09:24:41.729084 systemd-journald[1248]: Collecting audit messages is enabled. Jan 19 09:24:41.729118 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 19 09:24:41.729131 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 19 09:24:41.729148 kernel: fuse: init (API version 7.41) Jan 19 09:24:41.729162 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 19 09:24:41.729179 systemd-journald[1248]: Journal started Jan 19 09:24:41.729204 systemd-journald[1248]: Runtime Journal (/run/log/journal/9ca4cedd972b488fa27399359e1155d9) is 8M, max 76M, 68M free. Jan 19 09:24:41.446000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Jan 19 09:24:41.587000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.594000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 19 09:24:41.600000 audit: BPF prog-id=14 op=UNLOAD Jan 19 09:24:41.600000 audit: BPF prog-id=13 op=UNLOAD Jan 19 09:24:41.601000 audit: BPF prog-id=15 op=LOAD Jan 19 09:24:41.601000 audit: BPF prog-id=16 op=LOAD Jan 19 09:24:41.601000 audit: BPF prog-id=17 op=LOAD Jan 19 09:24:41.681000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.687000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.687000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.694000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.694000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.700000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.700000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.705000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.705000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.714000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.714000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.717000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.721000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 19 09:24:41.722000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 19 09:24:41.722000 audit[1248]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=6 a1=7ffcc5d2e240 a2=4000 a3=0 items=0 ppid=1 pid=1248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:24:41.722000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 19 09:24:41.729000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.333996 systemd[1]: Queued start job for default target multi-user.target. Jan 19 09:24:41.360343 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jan 19 09:24:41.360936 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 19 09:24:41.732465 systemd[1]: Started systemd-journald.service - Journal Service. Jan 19 09:24:41.733000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.736000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.738000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.738000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.736568 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 19 09:24:41.738012 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 19 09:24:41.738168 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 19 09:24:41.750945 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 19 09:24:41.753038 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 19 09:24:41.756543 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 19 09:24:41.759528 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 19 09:24:41.759913 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 19 09:24:41.759936 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 19 09:24:41.761326 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 19 09:24:41.762344 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 19 09:24:41.762528 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. 
Jan 19 09:24:41.768246 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 19 09:24:41.772682 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 19 09:24:41.773305 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 19 09:24:41.775653 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 19 09:24:41.776209 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 19 09:24:41.777712 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 19 09:24:41.779991 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 19 09:24:41.783000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.782378 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 19 09:24:41.785641 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 19 09:24:41.786252 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 19 09:24:41.793612 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 19 09:24:41.801000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.799642 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 19 09:24:41.801707 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 19 09:24:41.805589 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 19 09:24:41.835923 systemd-journald[1248]: Time spent on flushing to /var/log/journal/9ca4cedd972b488fa27399359e1155d9 is 29.502ms for 1368 entries. Jan 19 09:24:41.835923 systemd-journald[1248]: System Journal (/var/log/journal/9ca4cedd972b488fa27399359e1155d9) is 8M, max 588.1M, 580.1M free. Jan 19 09:24:41.878569 systemd-journald[1248]: Received client request to flush runtime journal. Jan 19 09:24:41.878603 kernel: loop1: detected capacity change from 0 to 8 Jan 19 09:24:41.878616 kernel: loop2: detected capacity change from 0 to 50784 Jan 19 09:24:41.837000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.845000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.860000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 19 09:24:41.880000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.836527 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 19 09:24:41.843885 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 19 09:24:41.860690 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 19 09:24:41.880889 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 19 09:24:41.884568 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 19 09:24:41.884000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.886000 audit: BPF prog-id=18 op=LOAD Jan 19 09:24:41.886000 audit: BPF prog-id=19 op=LOAD Jan 19 09:24:41.886000 audit: BPF prog-id=20 op=LOAD Jan 19 09:24:41.887598 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 19 09:24:41.890000 audit: BPF prog-id=21 op=LOAD Jan 19 09:24:41.891563 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 19 09:24:41.894292 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 19 09:24:41.900000 audit: BPF prog-id=22 op=LOAD Jan 19 09:24:41.900000 audit: BPF prog-id=23 op=LOAD Jan 19 09:24:41.900000 audit: BPF prog-id=24 op=LOAD Jan 19 09:24:41.905456 kernel: loop3: detected capacity change from 0 to 219144 Jan 19 09:24:41.904621 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 19 09:24:41.905000 audit: BPF prog-id=25 op=LOAD Jan 19 09:24:41.905000 audit: BPF prog-id=26 op=LOAD Jan 19 09:24:41.905000 audit: BPF prog-id=27 op=LOAD Jan 19 09:24:41.908570 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 19 09:24:41.942529 systemd-tmpfiles[1326]: ACLs are not supported, ignoring. Jan 19 09:24:41.942542 systemd-tmpfiles[1326]: ACLs are not supported, ignoring. Jan 19 09:24:41.949000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.949531 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 19 09:24:41.953567 kernel: loop4: detected capacity change from 0 to 111560 Jan 19 09:24:41.966000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:41.966086 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 19 09:24:41.981104 systemd-nsresourced[1327]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 19 09:24:41.989226 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 19 09:24:41.990000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 19 09:24:41.994504 kernel: loop5: detected capacity change from 0 to 8 Jan 19 09:24:42.001459 kernel: loop6: detected capacity change from 0 to 50784 Jan 19 09:24:42.020456 kernel: loop7: detected capacity change from 0 to 219144 Jan 19 09:24:42.042468 kernel: loop1: detected capacity change from 0 to 111560 Jan 19 09:24:42.055824 (sd-merge)[1345]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-hetzner.raw'. Jan 19 09:24:42.061240 (sd-merge)[1345]: Merged extensions into '/usr'. Jan 19 09:24:42.067616 systemd[1]: Reload requested from client PID 1304 ('systemd-sysext') (unit systemd-sysext.service)... Jan 19 09:24:42.067706 systemd[1]: Reloading... Jan 19 09:24:42.074617 systemd-oomd[1324]: No swap; memory pressure usage will be degraded Jan 19 09:24:42.085720 systemd-resolved[1325]: Positive Trust Anchors: Jan 19 09:24:42.085949 systemd-resolved[1325]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 19 09:24:42.086005 systemd-resolved[1325]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 19 09:24:42.086049 systemd-resolved[1325]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 19 09:24:42.103221 systemd-resolved[1325]: Using system hostname 'ci-4580-0-0-p-637e605ed7'. Jan 19 09:24:42.137467 zram_generator::config[1378]: No configuration found. Jan 19 09:24:42.289876 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 19 09:24:42.290490 systemd[1]: Reloading finished in 221 ms. Jan 19 09:24:42.323149 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 19 09:24:42.328168 kernel: kauditd_printk_skb: 112 callbacks suppressed Jan 19 09:24:42.328276 kernel: audit: type=1130 audit(1768814682.323:149): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.323000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.328133 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 19 09:24:42.329000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.330257 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 19 09:24:42.333454 kernel: audit: type=1130 audit(1768814682.329:150): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 19 09:24:42.333000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.334951 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 19 09:24:42.338459 kernel: audit: type=1130 audit(1768814682.333:151): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.338000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.343455 kernel: audit: type=1130 audit(1768814682.338:152): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.346754 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 19 09:24:42.360991 systemd[1]: Starting ensure-sysext.service... Jan 19 09:24:42.366727 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 19 09:24:42.368000 audit: BPF prog-id=8 op=UNLOAD Jan 19 09:24:42.373512 kernel: audit: type=1334 audit(1768814682.368:153): prog-id=8 op=UNLOAD Jan 19 09:24:42.373564 kernel: audit: type=1334 audit(1768814682.368:154): prog-id=7 op=UNLOAD Jan 19 09:24:42.373589 kernel: audit: type=1334 audit(1768814682.369:155): prog-id=28 op=LOAD Jan 19 09:24:42.373611 kernel: audit: type=1334 audit(1768814682.369:156): prog-id=29 op=LOAD Jan 19 09:24:42.368000 audit: BPF prog-id=7 op=UNLOAD Jan 19 09:24:42.369000 audit: BPF prog-id=28 op=LOAD Jan 19 09:24:42.369000 audit: BPF prog-id=29 op=LOAD Jan 19 09:24:42.372718 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jan 19 09:24:42.388000 audit: BPF prog-id=30 op=LOAD Jan 19 09:24:42.394496 kernel: audit: type=1334 audit(1768814682.388:157): prog-id=30 op=LOAD Jan 19 09:24:42.388000 audit: BPF prog-id=25 op=UNLOAD Jan 19 09:24:42.399506 kernel: audit: type=1334 audit(1768814682.388:158): prog-id=25 op=UNLOAD Jan 19 09:24:42.388000 audit: BPF prog-id=31 op=LOAD Jan 19 09:24:42.388000 audit: BPF prog-id=32 op=LOAD Jan 19 09:24:42.388000 audit: BPF prog-id=26 op=UNLOAD Jan 19 09:24:42.388000 audit: BPF prog-id=27 op=UNLOAD Jan 19 09:24:42.393000 audit: BPF prog-id=33 op=LOAD Jan 19 09:24:42.393000 audit: BPF prog-id=15 op=UNLOAD Jan 19 09:24:42.393000 audit: BPF prog-id=34 op=LOAD Jan 19 09:24:42.393000 audit: BPF prog-id=35 op=LOAD Jan 19 09:24:42.393000 audit: BPF prog-id=16 op=UNLOAD Jan 19 09:24:42.393000 audit: BPF prog-id=17 op=UNLOAD Jan 19 09:24:42.399000 audit: BPF prog-id=36 op=LOAD Jan 19 09:24:42.399000 audit: BPF prog-id=21 op=UNLOAD Jan 19 09:24:42.400000 audit: BPF prog-id=37 op=LOAD Jan 19 09:24:42.400000 audit: BPF prog-id=22 op=UNLOAD Jan 19 09:24:42.401000 audit: BPF prog-id=38 op=LOAD Jan 19 09:24:42.407000 audit: BPF prog-id=39 op=LOAD Jan 19 09:24:42.407000 audit: BPF prog-id=23 op=UNLOAD Jan 19 09:24:42.407000 audit: BPF prog-id=24 op=UNLOAD Jan 19 09:24:42.408000 audit: BPF prog-id=40 op=LOAD Jan 19 09:24:42.408000 audit: BPF prog-id=18 op=UNLOAD Jan 19 09:24:42.408000 audit: BPF prog-id=41 op=LOAD Jan 19 09:24:42.408000 audit: BPF prog-id=42 op=LOAD Jan 19 09:24:42.408000 audit: BPF prog-id=19 op=UNLOAD Jan 19 09:24:42.408000 audit: BPF prog-id=20 op=UNLOAD Jan 19 09:24:42.418560 systemd[1]: Reload requested from client PID 1421 ('systemctl') (unit ensure-sysext.service)... Jan 19 09:24:42.418572 systemd[1]: Reloading... Jan 19 09:24:42.421006 systemd-tmpfiles[1422]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 19 09:24:42.421079 systemd-tmpfiles[1422]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 19 09:24:42.421757 systemd-tmpfiles[1422]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 19 09:24:42.424652 systemd-tmpfiles[1422]: ACLs are not supported, ignoring. Jan 19 09:24:42.424900 systemd-tmpfiles[1422]: ACLs are not supported, ignoring. Jan 19 09:24:42.446971 systemd-tmpfiles[1422]: Detected autofs mount point /boot during canonicalization of boot. Jan 19 09:24:42.446996 systemd-tmpfiles[1422]: Skipping /boot Jan 19 09:24:42.474494 systemd-udevd[1423]: Using default interface naming scheme 'v257'. Jan 19 09:24:42.479249 systemd-tmpfiles[1422]: Detected autofs mount point /boot during canonicalization of boot. Jan 19 09:24:42.479263 systemd-tmpfiles[1422]: Skipping /boot Jan 19 09:24:42.499494 zram_generator::config[1455]: No configuration found. Jan 19 09:24:42.640671 kernel: mousedev: PS/2 mouse device common for all mice Jan 19 09:24:42.696454 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Jan 19 09:24:42.704459 kernel: ACPI: button: Power Button [PWRF] Jan 19 09:24:42.744393 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 19 09:24:42.744600 systemd[1]: Reloading finished in 325 ms. Jan 19 09:24:42.754272 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. 
Jan 19 09:24:42.755485 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Jan 19 09:24:42.755000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.756203 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 19 09:24:42.756000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.760284 kernel: Console: switching to colour dummy device 80x25 Jan 19 09:24:42.766518 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Jan 19 09:24:42.766798 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 19 09:24:42.766824 kernel: [drm] features: -context_init Jan 19 09:24:42.766836 kernel: [drm] number of scanouts: 1 Jan 19 09:24:42.766847 kernel: [drm] number of cap sets: 0 Jan 19 09:24:42.766857 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jan 19 09:24:42.763000 audit: BPF prog-id=43 op=LOAD Jan 19 09:24:42.763000 audit: BPF prog-id=33 op=UNLOAD Jan 19 09:24:42.764000 audit: BPF prog-id=44 op=LOAD Jan 19 09:24:42.764000 audit: BPF prog-id=45 op=LOAD Jan 19 09:24:42.764000 audit: BPF prog-id=34 op=UNLOAD Jan 19 09:24:42.764000 audit: BPF prog-id=35 op=UNLOAD Jan 19 09:24:42.765000 audit: BPF prog-id=46 op=LOAD Jan 19 09:24:42.765000 audit: BPF prog-id=40 op=UNLOAD Jan 19 09:24:42.765000 audit: BPF prog-id=47 op=LOAD Jan 19 09:24:42.765000 audit: BPF prog-id=48 op=LOAD Jan 19 09:24:42.765000 audit: BPF prog-id=41 op=UNLOAD Jan 19 09:24:42.765000 audit: BPF prog-id=42 op=UNLOAD Jan 19 09:24:42.766000 audit: BPF prog-id=49 op=LOAD Jan 19 09:24:42.766000 audit: BPF prog-id=30 op=UNLOAD Jan 19 09:24:42.766000 audit: BPF prog-id=50 op=LOAD Jan 19 09:24:42.766000 audit: BPF prog-id=51 op=LOAD Jan 19 09:24:42.766000 audit: BPF prog-id=31 op=UNLOAD Jan 19 09:24:42.766000 audit: BPF prog-id=32 op=UNLOAD Jan 19 09:24:42.767000 audit: BPF prog-id=52 op=LOAD Jan 19 09:24:42.768000 audit: BPF prog-id=53 op=LOAD Jan 19 09:24:42.768000 audit: BPF prog-id=28 op=UNLOAD Jan 19 09:24:42.768000 audit: BPF prog-id=29 op=UNLOAD Jan 19 09:24:42.771491 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Jan 19 09:24:42.771529 kernel: Console: switching to colour frame buffer device 160x50 Jan 19 09:24:42.769000 audit: BPF prog-id=54 op=LOAD Jan 19 09:24:42.769000 audit: BPF prog-id=37 op=UNLOAD Jan 19 09:24:42.769000 audit: BPF prog-id=55 op=LOAD Jan 19 09:24:42.769000 audit: BPF prog-id=56 op=LOAD Jan 19 09:24:42.769000 audit: BPF prog-id=38 op=UNLOAD Jan 19 09:24:42.769000 audit: BPF prog-id=39 op=UNLOAD Jan 19 09:24:42.770000 audit: BPF prog-id=57 op=LOAD Jan 19 09:24:42.771000 audit: BPF prog-id=36 op=UNLOAD Jan 19 09:24:42.778460 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 19 09:24:42.786463 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Jan 19 09:24:42.786690 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 19 09:24:42.792453 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 19 09:24:42.797805 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. 
Jan 19 09:24:42.809210 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 19 09:24:42.811654 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 19 09:24:42.814777 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 19 09:24:42.815687 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 19 09:24:42.817299 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 19 09:24:42.821641 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 19 09:24:42.827736 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 19 09:24:42.828890 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 19 09:24:42.829049 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 19 09:24:42.831168 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 19 09:24:42.831988 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 19 09:24:42.834644 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 19 09:24:42.843004 kernel: EDAC MC: Ver: 3.0.0 Jan 19 09:24:42.841000 audit: BPF prog-id=58 op=LOAD Jan 19 09:24:42.843906 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 19 09:24:42.846873 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 19 09:24:42.847763 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 19 09:24:42.850401 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 19 09:24:42.862528 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 19 09:24:42.864000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.864000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.874928 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jan 19 09:24:42.879323 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 19 09:24:42.879510 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 19 09:24:42.890356 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 19 09:24:42.891346 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Jan 19 09:24:42.891534 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 19 09:24:42.901000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=clean-ca-certificates comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.902000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.902000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.898077 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 19 09:24:42.899519 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 19 09:24:42.899618 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 19 09:24:42.900843 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 19 09:24:42.902066 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 19 09:24:42.902285 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 19 09:24:42.908546 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 19 09:24:42.912933 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 19 09:24:42.913518 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 19 09:24:42.925733 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 19 09:24:42.927330 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 19 09:24:42.927657 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 19 09:24:42.927863 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 19 09:24:42.927928 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 19 09:24:42.927971 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Jan 19 09:24:42.931000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.931000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.931111 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 19 09:24:42.931340 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 19 09:24:42.939000 audit[1555]: SYSTEM_BOOT pid=1555 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.949147 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 19 09:24:42.949342 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 19 09:24:42.954092 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 19 09:24:42.958784 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 19 09:24:42.961733 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 19 09:24:42.966000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.962460 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 19 09:24:42.962860 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 19 09:24:42.962986 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 19 09:24:42.963569 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 19 09:24:42.965944 systemd[1]: Finished ensure-sysext.service. Jan 19 09:24:42.970000 audit: BPF prog-id=59 op=LOAD Jan 19 09:24:42.974959 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jan 19 09:24:42.979000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:42.979213 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 19 09:24:43.006716 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 19 09:24:43.021168 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. 
Jan 19 09:24:43.021000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:43.040391 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 19 09:24:43.040673 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 19 09:24:43.045000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:43.045000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:43.045884 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 19 09:24:43.046076 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 19 09:24:43.046000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:43.046000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:43.046721 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 19 09:24:43.046893 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 19 09:24:43.046000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:43.046000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:43.048691 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 19 09:24:43.048952 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 19 09:24:43.048000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:43.048000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:43.057107 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 19 09:24:43.057161 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 19 09:24:43.087000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 19 09:24:43.086512 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 19 09:24:43.089642 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 19 09:24:43.090000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:43.090000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:24:43.090458 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 19 09:24:43.099000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 19 09:24:43.099000 audit[1607]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffdd032ab60 a2=420 a3=0 items=0 ppid=1544 pid=1607 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:24:43.099000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 19 09:24:43.100057 augenrules[1607]: No rules Jan 19 09:24:43.100390 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 19 09:24:43.104272 systemd[1]: audit-rules.service: Deactivated successfully. Jan 19 09:24:43.104529 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 19 09:24:43.119455 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 19 09:24:43.119690 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 19 09:24:43.122292 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 19 09:24:43.238448 systemd-networkd[1554]: lo: Link UP Jan 19 09:24:43.239494 systemd-networkd[1554]: lo: Gained carrier Jan 19 09:24:43.242838 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jan 19 09:24:43.244835 systemd[1]: Reached target time-set.target - System Time Set. Jan 19 09:24:43.252536 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 19 09:24:43.253704 systemd-networkd[1554]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 19 09:24:43.253712 systemd-networkd[1554]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 19 09:24:43.254129 systemd[1]: Reached target network.target - Network. Jan 19 09:24:43.255396 systemd-networkd[1554]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 19 09:24:43.255404 systemd-networkd[1554]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 19 09:24:43.256316 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 19 09:24:43.260559 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... 
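The PROCTITLE field in the audit records above (and in the later auditctl, sshd-session and iptables records further down) is the triggering process's command line, hex-encoded by the kernel because its argv entries are separated by NUL bytes. As an illustration only, using the field value copied verbatim from this log, a short stand-alone decoder recovers the readable command:

```python
# Decode the hex-encoded PROCTITLE field of a Linux audit record.
# The kernel hex-encodes the value because argv strings are NUL-separated.
def decode_proctitle(hexstr: str) -> str:
    raw = bytes.fromhex(hexstr)
    # argv entries are separated by NUL bytes; join them with spaces
    return " ".join(part.decode("utf-8", errors="replace")
                    for part in raw.split(b"\x00") if part)

# Value copied from the audit-rules record above
print(decode_proctitle(
    "2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573"
))  # -> /sbin/auditctl -R /etc/audit/audit.rules
```

The same decoding applies to the later proctitle values, which resolve to the sshd session helper and the Docker-issued iptables invocations.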
Jan 19 09:24:43.260802 systemd-networkd[1554]: eth1: Link UP Jan 19 09:24:43.261255 systemd-networkd[1554]: eth1: Gained carrier Jan 19 09:24:43.261269 systemd-networkd[1554]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 19 09:24:43.268694 systemd-networkd[1554]: eth0: Link UP Jan 19 09:24:43.270445 systemd-networkd[1554]: eth0: Gained carrier Jan 19 09:24:43.270463 systemd-networkd[1554]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 19 09:24:43.285164 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 19 09:24:43.290460 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 19 09:24:43.306547 systemd-networkd[1554]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Jan 19 09:24:43.308463 systemd-timesyncd[1580]: Network configuration changed, trying to establish connection. Jan 19 09:24:43.333888 systemd-networkd[1554]: eth0: DHCPv4 address 77.42.19.180/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jan 19 09:24:43.335430 systemd-timesyncd[1580]: Network configuration changed, trying to establish connection. Jan 19 09:24:43.473929 ldconfig[1551]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 19 09:24:43.480676 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 19 09:24:43.485780 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 19 09:24:43.522106 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 19 09:24:43.524078 systemd[1]: Reached target sysinit.target - System Initialization. Jan 19 09:24:43.528207 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 19 09:24:43.529506 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 19 09:24:43.530761 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jan 19 09:24:43.532198 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 19 09:24:43.533801 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 19 09:24:43.536599 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 19 09:24:43.539096 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 19 09:24:43.540199 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 19 09:24:43.541895 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 19 09:24:43.541943 systemd[1]: Reached target paths.target - Path Units. Jan 19 09:24:43.542803 systemd[1]: Reached target timers.target - Timer Units. Jan 19 09:24:43.549679 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 19 09:24:43.554760 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 19 09:24:43.560876 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 19 09:24:43.564301 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). 
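Both DHCP leases above are /32 host addresses, and eth0's gateway (172.31.1.1) lies outside that one-address prefix, which is common on Hetzner Cloud. A quick standard-library check, using the addresses exactly as logged, shows the gateway is not covered by the interface prefix, so reachability depends on an on-link/host route installed by the DHCP client rather than ordinary subnet routing (the routing detail is an inference from the log, not something the log states directly):

```python
import ipaddress

# Addresses exactly as reported by systemd-networkd above
iface = ipaddress.ip_interface("77.42.19.180/32")
gateway = ipaddress.ip_address("172.31.1.1")

print(iface.network)             # 77.42.19.180/32 -- a single-host prefix
print(gateway in iface.network)  # False: gateway is off-prefix, so an
                                 # on-link route is needed to reach it
```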
Jan 19 09:24:43.564818 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 19 09:24:43.576666 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 19 09:24:43.579905 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 19 09:24:43.581212 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 19 09:24:43.585964 systemd[1]: Reached target sockets.target - Socket Units. Jan 19 09:24:43.589107 systemd[1]: Reached target basic.target - Basic System. Jan 19 09:24:43.590147 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 19 09:24:43.590199 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 19 09:24:43.592535 systemd[1]: Starting containerd.service - containerd container runtime... Jan 19 09:24:43.605039 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 19 09:24:43.610659 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 19 09:24:43.622911 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 19 09:24:43.632683 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 19 09:24:43.639732 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 19 09:24:43.642263 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 19 09:24:43.649838 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jan 19 09:24:43.655010 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 19 09:24:43.665672 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 19 09:24:43.675237 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Jan 19 09:24:43.682643 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 19 09:24:43.690715 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 19 09:24:43.697727 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 19 09:24:43.698099 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 19 09:24:43.698527 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 19 09:24:43.703563 systemd[1]: Starting update-engine.service - Update Engine... Jan 19 09:24:43.708856 jq[1638]: false Jan 19 09:24:43.710941 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 19 09:24:43.717953 google_oslogin_nss_cache[1640]: oslogin_cache_refresh[1640]: Refreshing passwd entry cache Jan 19 09:24:43.717953 oslogin_cache_refresh[1640]: Refreshing passwd entry cache Jan 19 09:24:43.728566 oslogin_cache_refresh[1640]: Failure getting users, quitting Jan 19 09:24:43.726841 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 19 09:24:43.728884 google_oslogin_nss_cache[1640]: oslogin_cache_refresh[1640]: Failure getting users, quitting Jan 19 09:24:43.728884 google_oslogin_nss_cache[1640]: oslogin_cache_refresh[1640]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. 
Jan 19 09:24:43.728884 google_oslogin_nss_cache[1640]: oslogin_cache_refresh[1640]: Refreshing group entry cache Jan 19 09:24:43.728583 oslogin_cache_refresh[1640]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 19 09:24:43.727765 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 19 09:24:43.728619 oslogin_cache_refresh[1640]: Refreshing group entry cache Jan 19 09:24:43.728032 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 19 09:24:43.729247 google_oslogin_nss_cache[1640]: oslogin_cache_refresh[1640]: Failure getting groups, quitting Jan 19 09:24:43.729247 google_oslogin_nss_cache[1640]: oslogin_cache_refresh[1640]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 19 09:24:43.729219 oslogin_cache_refresh[1640]: Failure getting groups, quitting Jan 19 09:24:43.729228 oslogin_cache_refresh[1640]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 19 09:24:43.732066 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jan 19 09:24:43.732414 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jan 19 09:24:43.741681 extend-filesystems[1639]: Found /dev/sda6 Jan 19 09:24:43.751589 coreos-metadata[1633]: Jan 19 09:24:43.751 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Jan 19 09:24:43.755965 systemd[1]: motdgen.service: Deactivated successfully. Jan 19 09:24:43.760161 coreos-metadata[1633]: Jan 19 09:24:43.754 INFO Fetch successful Jan 19 09:24:43.760161 coreos-metadata[1633]: Jan 19 09:24:43.754 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Jan 19 09:24:43.760161 coreos-metadata[1633]: Jan 19 09:24:43.754 INFO Fetch successful Jan 19 09:24:43.757096 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 19 09:24:43.765275 extend-filesystems[1639]: Found /dev/sda9 Jan 19 09:24:43.766396 update_engine[1648]: I20260119 09:24:43.764756 1648 main.cc:92] Flatcar Update Engine starting Jan 19 09:24:43.762320 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 19 09:24:43.773225 extend-filesystems[1639]: Checking size of /dev/sda9 Jan 19 09:24:43.773630 jq[1649]: true Jan 19 09:24:43.763520 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 19 09:24:43.780462 tar[1657]: linux-amd64/LICENSE Jan 19 09:24:43.780462 tar[1657]: linux-amd64/helm Jan 19 09:24:43.784221 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 19 09:24:43.800127 extend-filesystems[1639]: Resized partition /dev/sda9 Jan 19 09:24:43.806509 extend-filesystems[1692]: resize2fs 1.47.3 (8-Jul-2025) Jan 19 09:24:43.821239 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 18410491 blocks Jan 19 09:24:43.821948 jq[1681]: true Jan 19 09:24:43.840558 dbus-daemon[1634]: [system] SELinux support is enabled Jan 19 09:24:43.840984 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 19 09:24:43.846512 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 19 09:24:43.847333 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Jan 19 09:24:43.848523 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 19 09:24:43.848539 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 19 09:24:43.866973 systemd[1]: Started update-engine.service - Update Engine. Jan 19 09:24:43.871205 update_engine[1648]: I20260119 09:24:43.870947 1648 update_check_scheduler.cc:74] Next update check in 6m23s Jan 19 09:24:43.877365 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 19 09:24:43.973793 systemd-logind[1647]: New seat seat0. Jan 19 09:24:43.990891 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 19 09:24:43.991565 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 19 09:24:43.999545 systemd-logind[1647]: Watching system buttons on /dev/input/event3 (Power Button) Jan 19 09:24:43.999567 systemd-logind[1647]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 19 09:24:43.999806 systemd[1]: Started systemd-logind.service - User Login Management. Jan 19 09:24:44.051310 bash[1722]: Updated "/home/core/.ssh/authorized_keys" Jan 19 09:24:44.049943 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 19 09:24:44.057101 systemd[1]: Starting sshkeys.service... Jan 19 09:24:44.091688 containerd[1676]: time="2026-01-19T09:24:44Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 19 09:24:44.097691 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 19 09:24:44.102465 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
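coreos-metadata above pulls instance data from the Hetzner metadata service at 169.254.169.254 (the /metadata and /private-networks paths appear in the log, and the SSH-keys agent fetches /public-keys a little further down). A minimal sketch of such a request, assuming only that the endpoint answers plain HTTP on the link-local address as logged; this is not the agent's actual implementation:

```python
import urllib.request

# Endpoints as they appear in the coreos-metadata log lines
BASE = "http://169.254.169.254/hetzner/v1/metadata"

def fetch(path: str = "") -> str:
    # Plain HTTP GET against the link-local metadata service
    with urllib.request.urlopen(BASE + path, timeout=5) as resp:
        return resp.read().decode()

if __name__ == "__main__":
    print(fetch())                     # instance metadata document
    print(fetch("/private-networks"))  # attached private networks
```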
Jan 19 09:24:44.105949 containerd[1676]: time="2026-01-19T09:24:44.105045545Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 19 09:24:44.139217 containerd[1676]: time="2026-01-19T09:24:44.137197402Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.45µs" Jan 19 09:24:44.139217 containerd[1676]: time="2026-01-19T09:24:44.137232332Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 19 09:24:44.139217 containerd[1676]: time="2026-01-19T09:24:44.137268182Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 19 09:24:44.139217 containerd[1676]: time="2026-01-19T09:24:44.137280652Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 19 09:24:44.139217 containerd[1676]: time="2026-01-19T09:24:44.137422472Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 19 09:24:44.139217 containerd[1676]: time="2026-01-19T09:24:44.137451752Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 19 09:24:44.139217 containerd[1676]: time="2026-01-19T09:24:44.137519382Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 19 09:24:44.139217 containerd[1676]: time="2026-01-19T09:24:44.137529042Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 19 09:24:44.139217 containerd[1676]: time="2026-01-19T09:24:44.137729232Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 19 09:24:44.139217 containerd[1676]: time="2026-01-19T09:24:44.137741082Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 19 09:24:44.139217 containerd[1676]: time="2026-01-19T09:24:44.137749752Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 19 09:24:44.139217 containerd[1676]: time="2026-01-19T09:24:44.137758382Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 19 09:24:44.139461 containerd[1676]: time="2026-01-19T09:24:44.137889563Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 19 09:24:44.139461 containerd[1676]: time="2026-01-19T09:24:44.137902063Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 19 09:24:44.139461 containerd[1676]: time="2026-01-19T09:24:44.137963263Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 19 09:24:44.139461 containerd[1676]: time="2026-01-19T09:24:44.138156093Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 19 09:24:44.139461 containerd[1676]: time="2026-01-19T09:24:44.138188663Z" 
level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 19 09:24:44.139461 containerd[1676]: time="2026-01-19T09:24:44.138197913Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 19 09:24:44.139461 containerd[1676]: time="2026-01-19T09:24:44.138228133Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 19 09:24:44.143749 containerd[1676]: time="2026-01-19T09:24:44.142801317Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 19 09:24:44.143749 containerd[1676]: time="2026-01-19T09:24:44.142896567Z" level=info msg="metadata content store policy set" policy=shared Jan 19 09:24:44.159925 coreos-metadata[1729]: Jan 19 09:24:44.159 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Jan 19 09:24:44.167452 coreos-metadata[1729]: Jan 19 09:24:44.167 INFO Fetch successful Jan 19 09:24:44.175006 locksmithd[1701]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 19 09:24:44.177227 containerd[1676]: time="2026-01-19T09:24:44.177185805Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 19 09:24:44.177267 containerd[1676]: time="2026-01-19T09:24:44.177251535Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 19 09:24:44.177343 containerd[1676]: time="2026-01-19T09:24:44.177328145Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 19 09:24:44.177343 containerd[1676]: time="2026-01-19T09:24:44.177340735Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 19 09:24:44.177553 containerd[1676]: time="2026-01-19T09:24:44.177351745Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 19 09:24:44.177553 containerd[1676]: time="2026-01-19T09:24:44.177363575Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 19 09:24:44.177553 containerd[1676]: time="2026-01-19T09:24:44.177376555Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 19 09:24:44.177553 containerd[1676]: time="2026-01-19T09:24:44.177384255Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 19 09:24:44.177553 containerd[1676]: time="2026-01-19T09:24:44.177393595Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 19 09:24:44.177553 containerd[1676]: time="2026-01-19T09:24:44.177404255Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 19 09:24:44.177553 containerd[1676]: time="2026-01-19T09:24:44.177421635Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 19 09:24:44.177553 containerd[1676]: time="2026-01-19T09:24:44.177429065Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 19 09:24:44.177553 
containerd[1676]: time="2026-01-19T09:24:44.177454306Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 19 09:24:44.177553 containerd[1676]: time="2026-01-19T09:24:44.177464286Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 19 09:24:44.185917 containerd[1676]: time="2026-01-19T09:24:44.177704906Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 19 09:24:44.185917 containerd[1676]: time="2026-01-19T09:24:44.177724886Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 19 09:24:44.185917 containerd[1676]: time="2026-01-19T09:24:44.177736936Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 19 09:24:44.185917 containerd[1676]: time="2026-01-19T09:24:44.177745186Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 19 09:24:44.185917 containerd[1676]: time="2026-01-19T09:24:44.177767786Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 19 09:24:44.185917 containerd[1676]: time="2026-01-19T09:24:44.177775866Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 19 09:24:44.185917 containerd[1676]: time="2026-01-19T09:24:44.177795616Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 19 09:24:44.185917 containerd[1676]: time="2026-01-19T09:24:44.177803466Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 19 09:24:44.185917 containerd[1676]: time="2026-01-19T09:24:44.177811656Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 19 09:24:44.185917 containerd[1676]: time="2026-01-19T09:24:44.177819996Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 19 09:24:44.185917 containerd[1676]: time="2026-01-19T09:24:44.177827666Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 19 09:24:44.185917 containerd[1676]: time="2026-01-19T09:24:44.177860286Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 19 09:24:44.185917 containerd[1676]: time="2026-01-19T09:24:44.177900616Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 19 09:24:44.185917 containerd[1676]: time="2026-01-19T09:24:44.177910696Z" level=info msg="Start snapshots syncer" Jan 19 09:24:44.185917 containerd[1676]: time="2026-01-19T09:24:44.177943726Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 19 09:24:44.177578 unknown[1729]: wrote ssh authorized keys file for user: core Jan 19 09:24:44.186399 containerd[1676]: time="2026-01-19T09:24:44.178270356Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 19 09:24:44.186399 containerd[1676]: time="2026-01-19T09:24:44.178305446Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 19 09:24:44.186651 containerd[1676]: time="2026-01-19T09:24:44.178359526Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 19 09:24:44.186651 containerd[1676]: time="2026-01-19T09:24:44.185893453Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 19 09:24:44.186651 containerd[1676]: time="2026-01-19T09:24:44.185923323Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 19 09:24:44.186651 containerd[1676]: time="2026-01-19T09:24:44.185933833Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 19 09:24:44.186651 containerd[1676]: time="2026-01-19T09:24:44.185941863Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 19 09:24:44.186651 containerd[1676]: time="2026-01-19T09:24:44.185950993Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 19 09:24:44.186651 containerd[1676]: time="2026-01-19T09:24:44.185966303Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 19 09:24:44.186651 containerd[1676]: time="2026-01-19T09:24:44.185976963Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 19 09:24:44.186651 containerd[1676]: time="2026-01-19T09:24:44.185987533Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 19 
09:24:44.186651 containerd[1676]: time="2026-01-19T09:24:44.185997763Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 19 09:24:44.186651 containerd[1676]: time="2026-01-19T09:24:44.186020053Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 19 09:24:44.186651 containerd[1676]: time="2026-01-19T09:24:44.186033983Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 19 09:24:44.186651 containerd[1676]: time="2026-01-19T09:24:44.186040273Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 19 09:24:44.186915 containerd[1676]: time="2026-01-19T09:24:44.186047083Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 19 09:24:44.186915 containerd[1676]: time="2026-01-19T09:24:44.186053293Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 19 09:24:44.186915 containerd[1676]: time="2026-01-19T09:24:44.186070223Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 19 09:24:44.186915 containerd[1676]: time="2026-01-19T09:24:44.186078503Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 19 09:24:44.186915 containerd[1676]: time="2026-01-19T09:24:44.186092513Z" level=info msg="runtime interface created" Jan 19 09:24:44.186915 containerd[1676]: time="2026-01-19T09:24:44.186096933Z" level=info msg="created NRI interface" Jan 19 09:24:44.186915 containerd[1676]: time="2026-01-19T09:24:44.186103673Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 19 09:24:44.186915 containerd[1676]: time="2026-01-19T09:24:44.186115123Z" level=info msg="Connect containerd service" Jan 19 09:24:44.186915 containerd[1676]: time="2026-01-19T09:24:44.186129913Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 19 09:24:44.187939 containerd[1676]: time="2026-01-19T09:24:44.187918654Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 19 09:24:44.221153 kernel: EXT4-fs (sda9): resized filesystem to 18410491 Jan 19 09:24:44.257978 extend-filesystems[1692]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jan 19 09:24:44.257978 extend-filesystems[1692]: old_desc_blocks = 1, new_desc_blocks = 9 Jan 19 09:24:44.257978 extend-filesystems[1692]: The filesystem on /dev/sda9 is now 18410491 (4k) blocks long. Jan 19 09:24:44.270560 extend-filesystems[1639]: Resized filesystem in /dev/sda9 Jan 19 09:24:44.263456 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 19 09:24:44.263826 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 19 09:24:44.290383 update-ssh-keys[1740]: Updated "/home/core/.ssh/authorized_keys" Jan 19 09:24:44.276765 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 19 09:24:44.281517 systemd[1]: Finished sshkeys.service. 
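The on-line resize above grows the root filesystem from 1,617,920 to 18,410,491 blocks. With the 4k block size noted in the log, that is roughly 6.2 GiB before and about 70.2 GiB after; a trivial check of the arithmetic, using only the numbers printed by extend-filesystems:

```python
# Block counts and block size taken from the resize2fs/extend-filesystems lines
BLOCK_SIZE = 4096          # "(4k) blocks" per the log
before = 1_617_920         # initial size of /dev/sda9
after = 18_410_491         # size after the on-line resize

gib = lambda blocks: blocks * BLOCK_SIZE / 2**30
print(f"before: {gib(before):.1f} GiB")   # ~6.2 GiB
print(f"after:  {gib(after):.1f} GiB")    # ~70.2 GiB
```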
Jan 19 09:24:44.320021 containerd[1676]: time="2026-01-19T09:24:44.319864734Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 19 09:24:44.321145 containerd[1676]: time="2026-01-19T09:24:44.320206124Z" level=info msg="Start subscribing containerd event" Jan 19 09:24:44.321145 containerd[1676]: time="2026-01-19T09:24:44.320543585Z" level=info msg="Start recovering state" Jan 19 09:24:44.321145 containerd[1676]: time="2026-01-19T09:24:44.320640505Z" level=info msg="Start event monitor" Jan 19 09:24:44.321145 containerd[1676]: time="2026-01-19T09:24:44.320654615Z" level=info msg="Start cni network conf syncer for default" Jan 19 09:24:44.321145 containerd[1676]: time="2026-01-19T09:24:44.320665835Z" level=info msg="Start streaming server" Jan 19 09:24:44.321145 containerd[1676]: time="2026-01-19T09:24:44.320674785Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 19 09:24:44.321145 containerd[1676]: time="2026-01-19T09:24:44.320682655Z" level=info msg="runtime interface starting up..." Jan 19 09:24:44.321145 containerd[1676]: time="2026-01-19T09:24:44.320690485Z" level=info msg="starting plugins..." Jan 19 09:24:44.321145 containerd[1676]: time="2026-01-19T09:24:44.320703135Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 19 09:24:44.321814 containerd[1676]: time="2026-01-19T09:24:44.321799946Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 19 09:24:44.323453 containerd[1676]: time="2026-01-19T09:24:44.323259907Z" level=info msg="containerd successfully booted in 0.232500s" Jan 19 09:24:44.323562 systemd[1]: Started containerd.service - containerd container runtime. Jan 19 09:24:44.345756 sshd_keygen[1688]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 19 09:24:44.366035 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 19 09:24:44.370618 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 19 09:24:44.374379 systemd[1]: Started sshd@0-77.42.19.180:22-207.154.232.101:6116.service - OpenSSH per-connection server daemon (207.154.232.101:6116). Jan 19 09:24:44.389531 systemd[1]: issuegen.service: Deactivated successfully. Jan 19 09:24:44.390493 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 19 09:24:44.396616 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 19 09:24:44.415034 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 19 09:24:44.420338 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 19 09:24:44.424869 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 19 09:24:44.428848 systemd[1]: Reached target getty.target - Login Prompts. Jan 19 09:24:44.443445 tar[1657]: linux-amd64/README.md Jan 19 09:24:44.460199 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 19 09:24:44.514121 sshd[1764]: Unable to negotiate with 207.154.232.101 port 6116: no matching key exchange method found. Their offer: diffie-hellman-group14-sha1,diffie-hellman-group1-sha1,diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group-exchange-sha256 [preauth] Jan 19 09:24:44.516834 systemd[1]: sshd@0-77.42.19.180:22-207.154.232.101:6116.service: Deactivated successfully. Jan 19 09:24:44.676789 systemd-networkd[1554]: eth0: Gained IPv6LL Jan 19 09:24:44.677765 systemd-timesyncd[1580]: Network configuration changed, trying to establish connection. 
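The rejected connection above fails because the server accepts none of the client's offered key-exchange methods; the offer consists entirely of finite-field Diffie-Hellman variants, several of them SHA-1-based, which suggests the server's KexAlgorithms list is restricted to newer curve-based exchanges (an inference from the log, not something the log states). A small filter over the offer string copied verbatim from the failure message flags the legacy entries:

```python
# Key-exchange offer copied verbatim from the failed connection above
offer = ("diffie-hellman-group14-sha1,diffie-hellman-group1-sha1,"
         "diffie-hellman-group14-sha256,diffie-hellman-group16-sha512,"
         "diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,"
         "diffie-hellman-group-exchange-sha256").split(",")

# Flag the SHA-1-based exchanges and check whether anything curve-based
# was offered at all (assumed here to be what the server requires).
legacy = [kex for kex in offer if kex.endswith("sha1")]
print("SHA-1 based:", legacy)
print("curve25519/ecdh offered:",
      any(k.startswith(("curve25519", "ecdh")) for k in offer))  # False
```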
Jan 19 09:24:44.682846 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 19 09:24:44.687338 systemd[1]: Reached target network-online.target - Network is Online. Jan 19 09:24:44.694542 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 19 09:24:44.700995 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 19 09:24:44.751498 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 19 09:24:45.189799 systemd-networkd[1554]: eth1: Gained IPv6LL Jan 19 09:24:45.191857 systemd-timesyncd[1580]: Network configuration changed, trying to establish connection. Jan 19 09:24:46.164479 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 19 09:24:46.170105 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 19 09:24:46.171718 (kubelet)[1797]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 19 09:24:46.175428 systemd[1]: Startup finished in 3.330s (kernel) + 5.688s (initrd) + 5.604s (userspace) = 14.623s. Jan 19 09:24:46.842193 kubelet[1797]: E0119 09:24:46.842076 1797 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 19 09:24:46.847620 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 19 09:24:46.848037 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 19 09:24:46.849047 systemd[1]: kubelet.service: Consumed 1.401s CPU time, 258M memory peak. Jan 19 09:24:57.098302 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 19 09:24:57.100270 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 19 09:24:57.350159 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 19 09:24:57.362996 (kubelet)[1816]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 19 09:24:57.429549 kubelet[1816]: E0119 09:24:57.429427 1816 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 19 09:24:57.438663 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 19 09:24:57.439077 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 19 09:24:57.440047 systemd[1]: kubelet.service: Consumed 267ms CPU time, 110.5M memory peak. Jan 19 09:25:07.689259 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 19 09:25:07.692044 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 19 09:25:07.910651 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
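The kubelet failure above, and the identical failures on the scheduled restarts that follow, is the expected state of a node that has not yet joined a cluster: /var/lib/kubelet/config.yaml is normally written by kubeadm during init/join, so the unit keeps restarting until that happens. Purely to illustrate the shape of the file the error refers to, a minimal KubeletConfiguration stub could be serialized as below; the cgroupDriver value mirrors the SystemdCgroup=true seen in the containerd configuration earlier, and this placeholder is not a substitute for the kubeadm-generated file:

```python
import json

# Minimal KubeletConfiguration stub, written as JSON (which is also valid YAML).
# Illustrative only: kubeadm normally generates the real file, which the kubelet
# expects at /var/lib/kubelet/config.yaml.
config = {
    "apiVersion": "kubelet.config.k8s.io/v1beta1",
    "kind": "KubeletConfiguration",
    "cgroupDriver": "systemd",   # matches SystemdCgroup=true in containerd above
}

with open("config.yaml", "w") as f:   # written locally, not to /var/lib/kubelet
    json.dump(config, f, indent=2)
```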
Jan 19 09:25:07.931051 (kubelet)[1831]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 19 09:25:07.963167 kubelet[1831]: E0119 09:25:07.963045 1831 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 19 09:25:07.965529 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 19 09:25:07.965905 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 19 09:25:07.966806 systemd[1]: kubelet.service: Consumed 207ms CPU time, 110.3M memory peak. Jan 19 09:25:16.288307 systemd-timesyncd[1580]: Contacted time server 49.13.223.236:123 (2.flatcar.pool.ntp.org). Jan 19 09:25:16.288639 systemd-timesyncd[1580]: Initial clock synchronization to Mon 2026-01-19 09:25:16.288089 UTC. Jan 19 09:25:16.289605 systemd-resolved[1325]: Clock change detected. Flushing caches. Jan 19 09:25:17.152188 systemd[1]: Started sshd@1-77.42.19.180:22-4.153.228.146:57732.service - OpenSSH per-connection server daemon (4.153.228.146:57732). Jan 19 09:25:17.837426 sshd[1839]: Accepted publickey for core from 4.153.228.146 port 57732 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:25:17.840074 sshd-session[1839]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:25:17.846529 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 19 09:25:17.848163 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 19 09:25:17.853829 systemd-logind[1647]: New session 1 of user core. Jan 19 09:25:17.864764 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 19 09:25:17.867907 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 19 09:25:17.897515 (systemd)[1845]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:25:17.901264 systemd-logind[1647]: New session 2 of user core. Jan 19 09:25:18.088471 systemd[1845]: Queued start job for default target default.target. Jan 19 09:25:18.099656 systemd[1845]: Created slice app.slice - User Application Slice. Jan 19 09:25:18.099685 systemd[1845]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 19 09:25:18.099697 systemd[1845]: Reached target paths.target - Paths. Jan 19 09:25:18.099757 systemd[1845]: Reached target timers.target - Timers. Jan 19 09:25:18.101358 systemd[1845]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 19 09:25:18.104546 systemd[1845]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 19 09:25:18.120181 systemd[1845]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 19 09:25:18.125616 systemd[1845]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 19 09:25:18.125851 systemd[1845]: Reached target sockets.target - Sockets. Jan 19 09:25:18.126138 systemd[1845]: Reached target basic.target - Basic System. Jan 19 09:25:18.126588 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 19 09:25:18.127531 systemd[1845]: Reached target default.target - Main User Target. Jan 19 09:25:18.127634 systemd[1845]: Startup finished in 215ms. 
Jan 19 09:25:18.134617 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 19 09:25:18.523676 systemd[1]: Started sshd@2-77.42.19.180:22-4.153.228.146:57736.service - OpenSSH per-connection server daemon (4.153.228.146:57736). Jan 19 09:25:18.788483 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 19 09:25:18.790878 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 19 09:25:19.005741 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 19 09:25:19.018868 (kubelet)[1870]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 19 09:25:19.056896 kubelet[1870]: E0119 09:25:19.056728 1870 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 19 09:25:19.060217 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 19 09:25:19.060523 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 19 09:25:19.061243 systemd[1]: kubelet.service: Consumed 210ms CPU time, 110.1M memory peak. Jan 19 09:25:19.195314 sshd[1859]: Accepted publickey for core from 4.153.228.146 port 57736 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:25:19.197581 sshd-session[1859]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:25:19.204449 systemd-logind[1647]: New session 3 of user core. Jan 19 09:25:19.209598 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 19 09:25:19.566759 sshd[1878]: Connection closed by 4.153.228.146 port 57736 Jan 19 09:25:19.567866 sshd-session[1859]: pam_unix(sshd:session): session closed for user core Jan 19 09:25:19.574863 systemd[1]: sshd@2-77.42.19.180:22-4.153.228.146:57736.service: Deactivated successfully. Jan 19 09:25:19.579007 systemd[1]: session-3.scope: Deactivated successfully. Jan 19 09:25:19.580328 systemd-logind[1647]: Session 3 logged out. Waiting for processes to exit. Jan 19 09:25:19.582266 systemd-logind[1647]: Removed session 3. Jan 19 09:25:19.715616 systemd[1]: Started sshd@3-77.42.19.180:22-4.153.228.146:57742.service - OpenSSH per-connection server daemon (4.153.228.146:57742). Jan 19 09:25:20.385496 sshd[1884]: Accepted publickey for core from 4.153.228.146 port 57742 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:25:20.387883 sshd-session[1884]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:25:20.399484 systemd-logind[1647]: New session 4 of user core. Jan 19 09:25:20.407677 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 19 09:25:20.751739 sshd[1888]: Connection closed by 4.153.228.146 port 57742 Jan 19 09:25:20.752801 sshd-session[1884]: pam_unix(sshd:session): session closed for user core Jan 19 09:25:20.761001 systemd[1]: sshd@3-77.42.19.180:22-4.153.228.146:57742.service: Deactivated successfully. Jan 19 09:25:20.764800 systemd[1]: session-4.scope: Deactivated successfully. Jan 19 09:25:20.766119 systemd-logind[1647]: Session 4 logged out. Waiting for processes to exit. Jan 19 09:25:20.767852 systemd-logind[1647]: Removed session 4. 
Jan 19 09:25:20.888019 systemd[1]: Started sshd@4-77.42.19.180:22-4.153.228.146:57750.service - OpenSSH per-connection server daemon (4.153.228.146:57750). Jan 19 09:25:21.581457 sshd[1894]: Accepted publickey for core from 4.153.228.146 port 57750 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:25:21.584635 sshd-session[1894]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:25:21.595358 systemd-logind[1647]: New session 5 of user core. Jan 19 09:25:21.602708 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 19 09:25:21.952599 sshd[1898]: Connection closed by 4.153.228.146 port 57750 Jan 19 09:25:21.954703 sshd-session[1894]: pam_unix(sshd:session): session closed for user core Jan 19 09:25:21.962589 systemd[1]: sshd@4-77.42.19.180:22-4.153.228.146:57750.service: Deactivated successfully. Jan 19 09:25:21.967044 systemd[1]: session-5.scope: Deactivated successfully. Jan 19 09:25:21.968606 systemd-logind[1647]: Session 5 logged out. Waiting for processes to exit. Jan 19 09:25:21.971349 systemd-logind[1647]: Removed session 5. Jan 19 09:25:22.090106 systemd[1]: Started sshd@5-77.42.19.180:22-4.153.228.146:57758.service - OpenSSH per-connection server daemon (4.153.228.146:57758). Jan 19 09:25:22.754442 sshd[1904]: Accepted publickey for core from 4.153.228.146 port 57758 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:25:22.756058 sshd-session[1904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:25:22.762655 systemd-logind[1647]: New session 6 of user core. Jan 19 09:25:22.768785 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 19 09:25:23.016725 sudo[1909]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 19 09:25:23.017165 sudo[1909]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 19 09:25:23.039332 sudo[1909]: pam_unix(sudo:session): session closed for user root Jan 19 09:25:23.161669 sshd[1908]: Connection closed by 4.153.228.146 port 57758 Jan 19 09:25:23.163727 sshd-session[1904]: pam_unix(sshd:session): session closed for user core Jan 19 09:25:23.170667 systemd[1]: sshd@5-77.42.19.180:22-4.153.228.146:57758.service: Deactivated successfully. Jan 19 09:25:23.173895 systemd[1]: session-6.scope: Deactivated successfully. Jan 19 09:25:23.176435 systemd-logind[1647]: Session 6 logged out. Waiting for processes to exit. Jan 19 09:25:23.177731 systemd-logind[1647]: Removed session 6. Jan 19 09:25:23.301136 systemd[1]: Started sshd@6-77.42.19.180:22-4.153.228.146:57762.service - OpenSSH per-connection server daemon (4.153.228.146:57762). Jan 19 09:25:23.970458 sshd[1916]: Accepted publickey for core from 4.153.228.146 port 57762 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:25:23.972385 sshd-session[1916]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:25:23.978247 systemd-logind[1647]: New session 7 of user core. Jan 19 09:25:23.990669 systemd[1]: Started session-7.scope - Session 7 of User core. 
Jan 19 09:25:24.223812 sudo[1922]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 19 09:25:24.224206 sudo[1922]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 19 09:25:24.227754 sudo[1922]: pam_unix(sudo:session): session closed for user root Jan 19 09:25:24.236837 sudo[1921]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 19 09:25:24.237275 sudo[1921]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 19 09:25:24.246785 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 19 09:25:24.295457 kernel: kauditd_printk_skb: 83 callbacks suppressed Jan 19 09:25:24.295590 kernel: audit: type=1305 audit(1768814724.292:240): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 19 09:25:24.292000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 19 09:25:24.295693 augenrules[1946]: No rules Jan 19 09:25:24.297460 kernel: audit: type=1300 audit(1768814724.292:240): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fff143ce200 a2=420 a3=0 items=0 ppid=1927 pid=1946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:24.292000 audit[1946]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fff143ce200 a2=420 a3=0 items=0 ppid=1927 pid=1946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:24.298718 systemd[1]: audit-rules.service: Deactivated successfully. Jan 19 09:25:24.299081 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 19 09:25:24.292000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 19 09:25:24.302709 sudo[1921]: pam_unix(sudo:session): session closed for user root Jan 19 09:25:24.305761 kernel: audit: type=1327 audit(1768814724.292:240): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 19 09:25:24.305818 kernel: audit: type=1130 audit(1768814724.297:241): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:24.297000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:24.297000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:24.314101 kernel: audit: type=1131 audit(1768814724.297:242): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 19 09:25:24.314196 kernel: audit: type=1106 audit(1768814724.302:243): pid=1921 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 19 09:25:24.302000 audit[1921]: USER_END pid=1921 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 19 09:25:24.302000 audit[1921]: CRED_DISP pid=1921 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 19 09:25:24.322831 kernel: audit: type=1104 audit(1768814724.302:244): pid=1921 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 19 09:25:24.427241 sshd[1920]: Connection closed by 4.153.228.146 port 57762 Jan 19 09:25:24.428671 sshd-session[1916]: pam_unix(sshd:session): session closed for user core Jan 19 09:25:24.432000 audit[1916]: USER_END pid=1916 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:25:24.440538 kernel: audit: type=1106 audit(1768814724.432:245): pid=1916 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:25:24.432000 audit[1916]: CRED_DISP pid=1916 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:25:24.443986 systemd[1]: sshd@6-77.42.19.180:22-4.153.228.146:57762.service: Deactivated successfully. Jan 19 09:25:24.445503 kernel: audit: type=1104 audit(1768814724.432:246): pid=1916 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:25:24.440000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-77.42.19.180:22-4.153.228.146:57762 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:24.449083 systemd[1]: session-7.scope: Deactivated successfully. Jan 19 09:25:24.451440 kernel: audit: type=1131 audit(1768814724.440:247): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-77.42.19.180:22-4.153.228.146:57762 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:24.454566 systemd-logind[1647]: Session 7 logged out. Waiting for processes to exit. Jan 19 09:25:24.456283 systemd-logind[1647]: Removed session 7. 
Jan 19 09:25:24.558000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-77.42.19.180:22-4.153.228.146:57776 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:24.558894 systemd[1]: Started sshd@7-77.42.19.180:22-4.153.228.146:57776.service - OpenSSH per-connection server daemon (4.153.228.146:57776). Jan 19 09:25:25.224000 audit[1955]: USER_ACCT pid=1955 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:25:25.225284 sshd[1955]: Accepted publickey for core from 4.153.228.146 port 57776 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:25:25.226000 audit[1955]: CRED_ACQ pid=1955 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:25:25.226000 audit[1955]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd96407820 a2=3 a3=0 items=0 ppid=1 pid=1955 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:25.226000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:25:25.228888 sshd-session[1955]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:25:25.237837 systemd-logind[1647]: New session 8 of user core. Jan 19 09:25:25.246678 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 19 09:25:25.251000 audit[1955]: USER_START pid=1955 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:25:25.254000 audit[1959]: CRED_ACQ pid=1959 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:25:25.481000 audit[1960]: USER_ACCT pid=1960 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 19 09:25:25.482554 sudo[1960]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 19 09:25:25.482000 audit[1960]: CRED_REFR pid=1960 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 19 09:25:25.483272 sudo[1960]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 19 09:25:25.482000 audit[1960]: USER_START pid=1960 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 19 09:25:26.114162 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 19 09:25:26.123699 (dockerd)[1978]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 19 09:25:26.484906 dockerd[1978]: time="2026-01-19T09:25:26.484499692Z" level=info msg="Starting up" Jan 19 09:25:26.488857 dockerd[1978]: time="2026-01-19T09:25:26.488815205Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 19 09:25:26.510785 dockerd[1978]: time="2026-01-19T09:25:26.510694443Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 19 09:25:26.553980 systemd[1]: var-lib-docker-metacopy\x2dcheck1086987169-merged.mount: Deactivated successfully. Jan 19 09:25:26.590735 dockerd[1978]: time="2026-01-19T09:25:26.590384680Z" level=info msg="Loading containers: start." Jan 19 09:25:26.605432 kernel: Initializing XFRM netlink socket Jan 19 09:25:26.712000 audit[2027]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2027 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.712000 audit[2027]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffda4f23700 a2=0 a3=0 items=0 ppid=1978 pid=2027 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.712000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 19 09:25:26.715000 audit[2029]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2029 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.715000 audit[2029]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffe44afa010 a2=0 a3=0 items=0 ppid=1978 pid=2029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.715000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 19 09:25:26.718000 audit[2031]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2031 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.718000 audit[2031]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffeb52d1a00 a2=0 a3=0 items=0 ppid=1978 pid=2031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.718000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 19 09:25:26.722000 audit[2033]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2033 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.722000 audit[2033]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd6ea741a0 a2=0 a3=0 items=0 ppid=1978 pid=2033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.722000 
audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 19 09:25:26.724000 audit[2035]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2035 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.724000 audit[2035]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffdb448f850 a2=0 a3=0 items=0 ppid=1978 pid=2035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.724000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 19 09:25:26.728000 audit[2037]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2037 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.728000 audit[2037]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffd7f3ce180 a2=0 a3=0 items=0 ppid=1978 pid=2037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.728000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 19 09:25:26.732000 audit[2039]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2039 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.732000 audit[2039]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fff0de00ad0 a2=0 a3=0 items=0 ppid=1978 pid=2039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.732000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 19 09:25:26.736000 audit[2041]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2041 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.736000 audit[2041]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffc287e0ce0 a2=0 a3=0 items=0 ppid=1978 pid=2041 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.736000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 19 09:25:26.768000 audit[2044]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2044 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.768000 audit[2044]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7ffc13d74470 a2=0 a3=0 items=0 ppid=1978 pid=2044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.768000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 19 09:25:26.772000 audit[2046]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2046 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.772000 audit[2046]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffd0d849350 a2=0 a3=0 items=0 ppid=1978 pid=2046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.772000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 19 09:25:26.776000 audit[2048]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2048 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.776000 audit[2048]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffc6c908880 a2=0 a3=0 items=0 ppid=1978 pid=2048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.776000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 19 09:25:26.779000 audit[2050]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2050 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.779000 audit[2050]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7fffe71bf670 a2=0 a3=0 items=0 ppid=1978 pid=2050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.779000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 19 09:25:26.783000 audit[2052]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2052 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.783000 audit[2052]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7fffd6a4af10 a2=0 a3=0 items=0 ppid=1978 pid=2052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.783000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 19 09:25:26.866000 audit[2082]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2082 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:26.866000 audit[2082]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffc72e5d680 a2=0 a3=0 items=0 ppid=1978 pid=2082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.866000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 19 09:25:26.872000 audit[2084]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2084 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:26.872000 audit[2084]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffccd0a8cd0 a2=0 a3=0 items=0 ppid=1978 pid=2084 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.872000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 19 09:25:26.876000 audit[2086]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2086 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:26.876000 audit[2086]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffa92168c0 a2=0 a3=0 items=0 ppid=1978 pid=2086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.876000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 19 09:25:26.881000 audit[2088]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2088 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:26.881000 audit[2088]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff55c76900 a2=0 a3=0 items=0 ppid=1978 pid=2088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.881000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 19 09:25:26.886000 audit[2090]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2090 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:26.886000 audit[2090]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff5aa155f0 a2=0 a3=0 items=0 ppid=1978 pid=2090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.886000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 19 09:25:26.891000 audit[2092]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2092 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:26.891000 audit[2092]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffd208d6e80 a2=0 a3=0 items=0 ppid=1978 pid=2092 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.891000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 19 09:25:26.896000 audit[2094]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2094 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:26.896000 audit[2094]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffc22b01fc0 a2=0 a3=0 items=0 ppid=1978 pid=2094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.896000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 19 09:25:26.901000 audit[2096]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2096 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:26.901000 audit[2096]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7fff174e6ae0 a2=0 a3=0 items=0 ppid=1978 pid=2096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.901000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 19 09:25:26.907000 audit[2098]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2098 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:26.907000 audit[2098]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7ffc22a18280 a2=0 a3=0 items=0 ppid=1978 pid=2098 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.907000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 19 09:25:26.911000 audit[2100]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2100 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:26.911000 audit[2100]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffd84db8210 a2=0 a3=0 items=0 ppid=1978 pid=2100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.911000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 19 09:25:26.916000 audit[2102]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2102 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:26.916000 audit[2102]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffecac62e50 a2=0 a3=0 items=0 ppid=1978 pid=2102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.916000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 19 09:25:26.921000 audit[2104]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2104 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables" Jan 19 09:25:26.921000 audit[2104]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffe85a5cf20 a2=0 a3=0 items=0 ppid=1978 pid=2104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.921000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 19 09:25:26.926000 audit[2106]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2106 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:26.926000 audit[2106]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7fff594eddb0 a2=0 a3=0 items=0 ppid=1978 pid=2106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.926000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 19 09:25:26.939000 audit[2111]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2111 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.939000 audit[2111]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff6af7c890 a2=0 a3=0 items=0 ppid=1978 pid=2111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.939000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 19 09:25:26.944000 audit[2113]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2113 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.944000 audit[2113]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffc9d16fb40 a2=0 a3=0 items=0 ppid=1978 pid=2113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.944000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 19 09:25:26.949000 audit[2115]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2115 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:26.949000 audit[2115]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffc7ee3ca60 a2=0 a3=0 items=0 ppid=1978 pid=2115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.949000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 19 09:25:26.954000 audit[2117]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2117 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:26.954000 audit[2117]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe2f171e70 a2=0 a3=0 items=0 ppid=1978 pid=2117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.954000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 19 09:25:26.959000 audit[2119]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2119 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:26.959000 audit[2119]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffd87f17120 a2=0 a3=0 items=0 ppid=1978 pid=2119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.959000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 19 09:25:26.963000 audit[2121]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2121 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:26.963000 audit[2121]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffd73fc31c0 a2=0 a3=0 items=0 ppid=1978 pid=2121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:26.963000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 19 09:25:27.005000 audit[2125]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2125 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:27.005000 audit[2125]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffd6d27e4d0 a2=0 a3=0 items=0 ppid=1978 pid=2125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:27.005000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 19 09:25:27.011000 audit[2127]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2127 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:27.011000 audit[2127]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffd0ee27910 a2=0 a3=0 items=0 ppid=1978 pid=2127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:27.011000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 19 09:25:27.033000 audit[2135]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2135 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:27.033000 audit[2135]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7ffe19ee4e70 a2=0 a3=0 items=0 ppid=1978 pid=2135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
19 09:25:27.033000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 19 09:25:27.055000 audit[2141]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2141 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:27.055000 audit[2141]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffe718e0fa0 a2=0 a3=0 items=0 ppid=1978 pid=2141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:27.055000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 19 09:25:27.061000 audit[2143]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2143 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:27.061000 audit[2143]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7ffed9bf8750 a2=0 a3=0 items=0 ppid=1978 pid=2143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:27.061000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 19 09:25:27.066000 audit[2145]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2145 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:27.066000 audit[2145]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffd0e4a0120 a2=0 a3=0 items=0 ppid=1978 pid=2145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:27.066000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 19 09:25:27.071000 audit[2147]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2147 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:27.071000 audit[2147]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7fffb914a120 a2=0 a3=0 items=0 ppid=1978 pid=2147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:27.071000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 19 09:25:27.077000 audit[2149]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:27.077000 audit[2149]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffc31c2a320 a2=0 a3=0 items=0 ppid=1978 pid=2149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:27.077000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 19 09:25:27.078687 systemd-networkd[1554]: docker0: Link UP Jan 19 09:25:27.087501 dockerd[1978]: time="2026-01-19T09:25:27.087379294Z" level=info msg="Loading containers: done." Jan 19 09:25:27.115158 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2394303323-merged.mount: Deactivated successfully. Jan 19 09:25:27.141621 dockerd[1978]: time="2026-01-19T09:25:27.141510639Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 19 09:25:27.141899 dockerd[1978]: time="2026-01-19T09:25:27.141656789Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 19 09:25:27.141899 dockerd[1978]: time="2026-01-19T09:25:27.141821069Z" level=info msg="Initializing buildkit" Jan 19 09:25:27.184876 dockerd[1978]: time="2026-01-19T09:25:27.184780655Z" level=info msg="Completed buildkit initialization" Jan 19 09:25:27.196506 dockerd[1978]: time="2026-01-19T09:25:27.196344605Z" level=info msg="Daemon has completed initialization" Jan 19 09:25:27.196745 dockerd[1978]: time="2026-01-19T09:25:27.196668355Z" level=info msg="API listen on /run/docker.sock" Jan 19 09:25:27.197000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:27.197206 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 19 09:25:28.467412 containerd[1676]: time="2026-01-19T09:25:28.467331964Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Jan 19 09:25:29.289087 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 19 09:25:29.293805 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 19 09:25:29.523190 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1560700302.mount: Deactivated successfully. Jan 19 09:25:29.554665 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 19 09:25:29.556827 kernel: kauditd_printk_skb: 132 callbacks suppressed Jan 19 09:25:29.556927 kernel: audit: type=1130 audit(1768814729.554:298): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:29.554000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
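
The NETFILTER_CFG burst between 09:25:26.712 and 09:25:27.077 is dockerd registering its standard chains and rules (DOCKER, DOCKER-FORWARD, DOCKER-BRIDGE, DOCKER-CT, DOCKER-ISOLATION-STAGE-1/2, DOCKER-USER, plus the POSTROUTING MASQUERADE rule for 172.17.0.0/16, as the decoded proctitles show) in the nat and filter tables for both IPv4 (family=2) and IPv6 (family=10); docker0 only comes up once that is done. A rough tally sketch over the same assumed journal.txt dump:

    # tally_netfilter.py - summarize NETFILTER_CFG audit events per table, family and operation.
    import re
    from collections import Counter

    EVENT_RE = re.compile(r"NETFILTER_CFG table=(\w+):\d+ family=(\d+) entries=\d+ op=(\w+)")
    FAMILY = {"2": "ipv4", "10": "ipv6"}

    counts = Counter()
    with open("journal.txt", encoding="utf-8", errors="replace") as fh:   # file name is an assumption
        for line in fh:
            for table, family, op in EVENT_RE.findall(line):
                counts[(table, FAMILY.get(family, family), op)] += 1

    for (table, family, op), n in sorted(counts.items()):
        print(f"{table:8} {family:5} {op:22} {n}")
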
res=success' Jan 19 09:25:29.569906 (kubelet)[2206]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 19 09:25:29.629226 kubelet[2206]: E0119 09:25:29.629155 2206 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 19 09:25:29.634552 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 19 09:25:29.634899 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 19 09:25:29.635000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 19 09:25:29.641476 kernel: audit: type=1131 audit(1768814729.635:299): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 19 09:25:29.636451 systemd[1]: kubelet.service: Consumed 256ms CPU time, 110.4M memory peak. Jan 19 09:25:30.247159 update_engine[1648]: I20260119 09:25:30.247068 1648 update_attempter.cc:509] Updating boot flags... Jan 19 09:25:30.689965 containerd[1676]: time="2026-01-19T09:25:30.689905655Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:30.691421 containerd[1676]: time="2026-01-19T09:25:30.691332086Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=25399329" Jan 19 09:25:30.692405 containerd[1676]: time="2026-01-19T09:25:30.692355067Z" level=info msg="ImageCreate event name:\"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:30.694335 containerd[1676]: time="2026-01-19T09:25:30.694305229Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:30.695291 containerd[1676]: time="2026-01-19T09:25:30.694965929Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"27064672\" in 2.227579765s" Jan 19 09:25:30.695291 containerd[1676]: time="2026-01-19T09:25:30.694994489Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\"" Jan 19 09:25:30.695998 containerd[1676]: time="2026-01-19T09:25:30.695955320Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Jan 19 09:25:32.305018 containerd[1676]: time="2026-01-19T09:25:32.304934630Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:32.307801 containerd[1676]: 
time="2026-01-19T09:25:32.307592963Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=21154285" Jan 19 09:25:32.309019 containerd[1676]: time="2026-01-19T09:25:32.308948344Z" level=info msg="ImageCreate event name:\"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:32.311232 containerd[1676]: time="2026-01-19T09:25:32.311198826Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:32.312131 containerd[1676]: time="2026-01-19T09:25:32.312109756Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"22819474\" in 1.616126006s" Jan 19 09:25:32.312328 containerd[1676]: time="2026-01-19T09:25:32.312202447Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image reference \"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\"" Jan 19 09:25:32.313208 containerd[1676]: time="2026-01-19T09:25:32.313174447Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Jan 19 09:25:33.523765 containerd[1676]: time="2026-01-19T09:25:33.523697636Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:33.528421 containerd[1676]: time="2026-01-19T09:25:33.528048789Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=15717792" Jan 19 09:25:33.530942 containerd[1676]: time="2026-01-19T09:25:33.530899042Z" level=info msg="ImageCreate event name:\"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:33.536124 containerd[1676]: time="2026-01-19T09:25:33.536047876Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:33.537371 containerd[1676]: time="2026-01-19T09:25:33.537338497Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"17382979\" in 1.22408047s" Jan 19 09:25:33.537503 containerd[1676]: time="2026-01-19T09:25:33.537484297Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\"" Jan 19 09:25:33.538213 containerd[1676]: time="2026-01-19T09:25:33.538171258Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\"" Jan 19 09:25:34.755795 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3464434672.mount: Deactivated successfully. 
Jan 19 09:25:35.077640 containerd[1676]: time="2026-01-19T09:25:35.077539480Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:35.079124 containerd[1676]: time="2026-01-19T09:25:35.079080202Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=14375786" Jan 19 09:25:35.080119 containerd[1676]: time="2026-01-19T09:25:35.080075462Z" level=info msg="ImageCreate event name:\"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:35.082286 containerd[1676]: time="2026-01-19T09:25:35.082193674Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:35.082683 containerd[1676]: time="2026-01-19T09:25:35.082491324Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"25964312\" in 1.544273176s" Jan 19 09:25:35.082683 containerd[1676]: time="2026-01-19T09:25:35.082515514Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\"" Jan 19 09:25:35.083867 containerd[1676]: time="2026-01-19T09:25:35.083834175Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Jan 19 09:25:35.598144 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3079668882.mount: Deactivated successfully. 
Jan 19 09:25:36.425975 containerd[1676]: time="2026-01-19T09:25:36.425923234Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:36.428233 containerd[1676]: time="2026-01-19T09:25:36.428029225Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=21568511" Jan 19 09:25:36.429417 containerd[1676]: time="2026-01-19T09:25:36.429381496Z" level=info msg="ImageCreate event name:\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:36.431718 containerd[1676]: time="2026-01-19T09:25:36.431694888Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:36.432435 containerd[1676]: time="2026-01-19T09:25:36.432412929Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"22384805\" in 1.348544063s" Jan 19 09:25:36.432480 containerd[1676]: time="2026-01-19T09:25:36.432438669Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\"" Jan 19 09:25:36.433394 containerd[1676]: time="2026-01-19T09:25:36.433370200Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Jan 19 09:25:36.961785 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2190416770.mount: Deactivated successfully. 
Jan 19 09:25:36.971893 containerd[1676]: time="2026-01-19T09:25:36.971607498Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:36.973478 containerd[1676]: time="2026-01-19T09:25:36.973366050Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=0" Jan 19 09:25:36.975009 containerd[1676]: time="2026-01-19T09:25:36.974913451Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:36.985050 containerd[1676]: time="2026-01-19T09:25:36.984036168Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:36.986468 containerd[1676]: time="2026-01-19T09:25:36.986386710Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 552.98498ms" Jan 19 09:25:36.986468 containerd[1676]: time="2026-01-19T09:25:36.986458851Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\"" Jan 19 09:25:36.987204 containerd[1676]: time="2026-01-19T09:25:36.987112081Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Jan 19 09:25:37.563508 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1708429296.mount: Deactivated successfully. Jan 19 09:25:39.759784 containerd[1676]: time="2026-01-19T09:25:39.759723331Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:39.762005 containerd[1676]: time="2026-01-19T09:25:39.761493852Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=72348001" Jan 19 09:25:39.762848 containerd[1676]: time="2026-01-19T09:25:39.762812543Z" level=info msg="ImageCreate event name:\"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:39.767454 containerd[1676]: time="2026-01-19T09:25:39.767417717Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:25:39.768560 containerd[1676]: time="2026-01-19T09:25:39.768527558Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"74311308\" in 2.781369097s" Jan 19 09:25:39.768673 containerd[1676]: time="2026-01-19T09:25:39.768655598Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\"" Jan 19 09:25:39.788513 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. 
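
Each containerd "Pulled image" entry above reports the image size in bytes and the wall-clock pull time (kube-apiserver: 27064672 bytes in ~2.23 s; etcd: 74311308 bytes in ~2.78 s, roughly 25 MiB/s). A sketch that extracts those pairs and prints an approximate rate, again assuming the journal.txt dump used above:

    # pull_rate.py - approximate image pull throughput from containerd "Pulled image" lines.
    import re

    PULL_RE = re.compile(r'Pulled image \\?"([^"\\]+)\\?".*? size \\?"(\d+)\\?" in ([\d.]+)(ms|s)')

    with open("journal.txt", encoding="utf-8", errors="replace") as fh:   # file name is an assumption
        for line in fh:
            for image, size, value, unit in PULL_RE.findall(line):
                seconds = float(value) / (1000.0 if unit == "ms" else 1.0)
                mib_per_s = int(size) / seconds / (1024 * 1024)
                print(f"{image}: {int(size) / 1e6:.1f} MB in {seconds:.2f}s ~ {mib_per_s:.1f} MiB/s")
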
Jan 19 09:25:39.791402 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 19 09:25:39.963281 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 19 09:25:39.964000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:39.978449 kernel: audit: type=1130 audit(1768814739.964:300): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:39.984851 (kubelet)[2424]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 19 09:25:40.020193 kubelet[2424]: E0119 09:25:40.020051 2424 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 19 09:25:40.023449 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 19 09:25:40.023663 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 19 09:25:40.023000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 19 09:25:40.024184 systemd[1]: kubelet.service: Consumed 175ms CPU time, 108.6M memory peak. Jan 19 09:25:40.029657 kernel: audit: type=1131 audit(1768814740.023:301): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 19 09:25:42.918933 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 19 09:25:42.919726 systemd[1]: kubelet.service: Consumed 175ms CPU time, 108.6M memory peak. Jan 19 09:25:42.919000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:42.919000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:42.929899 kernel: audit: type=1130 audit(1768814742.919:302): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:42.930004 kernel: audit: type=1131 audit(1768814742.919:303): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:42.927576 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 19 09:25:42.962688 systemd[1]: Reload requested from client PID 2451 ('systemctl') (unit session-8.scope)... Jan 19 09:25:42.962711 systemd[1]: Reloading... Jan 19 09:25:43.098788 zram_generator::config[2500]: No configuration found. 
Jan 19 09:25:43.281493 systemd[1]: Reloading finished in 318 ms. Jan 19 09:25:43.301467 kernel: audit: type=1334 audit(1768814743.297:304): prog-id=63 op=LOAD Jan 19 09:25:43.297000 audit: BPF prog-id=63 op=LOAD Jan 19 09:25:43.305418 kernel: audit: type=1334 audit(1768814743.297:305): prog-id=59 op=UNLOAD Jan 19 09:25:43.297000 audit: BPF prog-id=59 op=UNLOAD Jan 19 09:25:43.308467 kernel: audit: type=1334 audit(1768814743.301:306): prog-id=64 op=LOAD Jan 19 09:25:43.301000 audit: BPF prog-id=64 op=LOAD Jan 19 09:25:43.312423 kernel: audit: type=1334 audit(1768814743.301:307): prog-id=58 op=UNLOAD Jan 19 09:25:43.301000 audit: BPF prog-id=58 op=UNLOAD Jan 19 09:25:43.304000 audit: BPF prog-id=65 op=LOAD Jan 19 09:25:43.315436 kernel: audit: type=1334 audit(1768814743.304:308): prog-id=65 op=LOAD Jan 19 09:25:43.304000 audit: BPF prog-id=46 op=UNLOAD Jan 19 09:25:43.304000 audit: BPF prog-id=66 op=LOAD Jan 19 09:25:43.304000 audit: BPF prog-id=67 op=LOAD Jan 19 09:25:43.304000 audit: BPF prog-id=47 op=UNLOAD Jan 19 09:25:43.304000 audit: BPF prog-id=48 op=UNLOAD Jan 19 09:25:43.308000 audit: BPF prog-id=68 op=LOAD Jan 19 09:25:43.308000 audit: BPF prog-id=43 op=UNLOAD Jan 19 09:25:43.308000 audit: BPF prog-id=69 op=LOAD Jan 19 09:25:43.308000 audit: BPF prog-id=70 op=LOAD Jan 19 09:25:43.308000 audit: BPF prog-id=44 op=UNLOAD Jan 19 09:25:43.308000 audit: BPF prog-id=45 op=UNLOAD Jan 19 09:25:43.311000 audit: BPF prog-id=71 op=LOAD Jan 19 09:25:43.311000 audit: BPF prog-id=49 op=UNLOAD Jan 19 09:25:43.311000 audit: BPF prog-id=72 op=LOAD Jan 19 09:25:43.311000 audit: BPF prog-id=73 op=LOAD Jan 19 09:25:43.311000 audit: BPF prog-id=50 op=UNLOAD Jan 19 09:25:43.311000 audit: BPF prog-id=51 op=UNLOAD Jan 19 09:25:43.314000 audit: BPF prog-id=74 op=LOAD Jan 19 09:25:43.314000 audit: BPF prog-id=57 op=UNLOAD Jan 19 09:25:43.315000 audit: BPF prog-id=75 op=LOAD Jan 19 09:25:43.318451 kernel: audit: type=1334 audit(1768814743.304:309): prog-id=46 op=UNLOAD Jan 19 09:25:43.320000 audit: BPF prog-id=54 op=UNLOAD Jan 19 09:25:43.320000 audit: BPF prog-id=76 op=LOAD Jan 19 09:25:43.320000 audit: BPF prog-id=77 op=LOAD Jan 19 09:25:43.320000 audit: BPF prog-id=55 op=UNLOAD Jan 19 09:25:43.320000 audit: BPF prog-id=56 op=UNLOAD Jan 19 09:25:43.322000 audit: BPF prog-id=78 op=LOAD Jan 19 09:25:43.322000 audit: BPF prog-id=60 op=UNLOAD Jan 19 09:25:43.322000 audit: BPF prog-id=79 op=LOAD Jan 19 09:25:43.322000 audit: BPF prog-id=80 op=LOAD Jan 19 09:25:43.322000 audit: BPF prog-id=61 op=UNLOAD Jan 19 09:25:43.322000 audit: BPF prog-id=62 op=UNLOAD Jan 19 09:25:43.322000 audit: BPF prog-id=81 op=LOAD Jan 19 09:25:43.322000 audit: BPF prog-id=82 op=LOAD Jan 19 09:25:43.322000 audit: BPF prog-id=52 op=UNLOAD Jan 19 09:25:43.322000 audit: BPF prog-id=53 op=UNLOAD Jan 19 09:25:43.346286 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 19 09:25:43.346401 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 19 09:25:43.346721 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 19 09:25:43.346000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 19 09:25:43.346803 systemd[1]: kubelet.service: Consumed 131ms CPU time, 98.4M memory peak. Jan 19 09:25:43.348439 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
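
The prog-id LOAD/UNLOAD pairs logged around the reload are consistent with systemd detaching and re-attaching the per-unit BPF programs it manages (device filters, IP accounting/filtering): each old program id is unloaded once a replacement has been loaded. A small sketch that counts and pairs them from the assumed journal.txt dump:

    # bpf_churn.py - count BPF program LOAD/UNLOAD audit events around the daemon reload.
    import re
    from collections import Counter

    BPF_RE = re.compile(r"audit: BPF prog-id=(\d+) op=(LOAD|UNLOAD)")

    ops = Counter()
    seen = {"LOAD": set(), "UNLOAD": set()}
    with open("journal.txt", encoding="utf-8", errors="replace") as fh:   # file name is an assumption
        for line in fh:
            for prog_id, op in BPF_RE.findall(line):
                ops[op] += 1
                seen[op].add(int(prog_id))

    print(f"loads={ops['LOAD']} unloads={ops['UNLOAD']}")
    print("ids both loaded and unloaded in this window:", sorted(seen["LOAD"] & seen["UNLOAD"]))
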
Jan 19 09:25:43.518823 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 19 09:25:43.519000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:43.528867 (kubelet)[2552]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 19 09:25:43.570888 kubelet[2552]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 19 09:25:43.572428 kubelet[2552]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 19 09:25:43.572428 kubelet[2552]: I0119 09:25:43.571504 2552 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 19 09:25:44.080820 kubelet[2552]: I0119 09:25:44.080769 2552 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Jan 19 09:25:44.080820 kubelet[2552]: I0119 09:25:44.080794 2552 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 19 09:25:44.080820 kubelet[2552]: I0119 09:25:44.080819 2552 watchdog_linux.go:95] "Systemd watchdog is not enabled" Jan 19 09:25:44.080820 kubelet[2552]: I0119 09:25:44.080830 2552 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 19 09:25:44.081039 kubelet[2552]: I0119 09:25:44.081020 2552 server.go:956] "Client rotation is on, will bootstrap in background" Jan 19 09:25:44.085998 kubelet[2552]: E0119 09:25:44.085900 2552 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://77.42.19.180:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 77.42.19.180:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jan 19 09:25:44.087924 kubelet[2552]: I0119 09:25:44.087407 2552 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 19 09:25:44.090790 kubelet[2552]: I0119 09:25:44.090766 2552 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 19 09:25:44.093927 kubelet[2552]: I0119 09:25:44.093887 2552 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Jan 19 09:25:44.094165 kubelet[2552]: I0119 09:25:44.094135 2552 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 19 09:25:44.094342 kubelet[2552]: I0119 09:25:44.094161 2552 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4580-0-0-p-637e605ed7","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 19 09:25:44.094449 kubelet[2552]: I0119 09:25:44.094344 2552 topology_manager.go:138] "Creating topology manager with none policy" Jan 19 09:25:44.094449 kubelet[2552]: I0119 09:25:44.094353 2552 container_manager_linux.go:306] "Creating device plugin manager" Jan 19 09:25:44.094507 kubelet[2552]: I0119 09:25:44.094490 2552 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Jan 19 09:25:44.096551 kubelet[2552]: I0119 09:25:44.096526 2552 state_mem.go:36] "Initialized new in-memory state store" Jan 19 09:25:44.097412 kubelet[2552]: I0119 09:25:44.096735 2552 kubelet.go:475] "Attempting to sync node with API server" Jan 19 09:25:44.097412 kubelet[2552]: I0119 09:25:44.096747 2552 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 19 09:25:44.097412 kubelet[2552]: I0119 09:25:44.096768 2552 kubelet.go:387] "Adding apiserver pod source" Jan 19 09:25:44.097412 kubelet[2552]: I0119 09:25:44.096793 2552 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 19 09:25:44.099112 kubelet[2552]: E0119 09:25:44.099079 2552 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://77.42.19.180:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 77.42.19.180:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 19 09:25:44.099166 kubelet[2552]: E0119 09:25:44.099142 2552 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get 
\"https://77.42.19.180:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4580-0-0-p-637e605ed7&limit=500&resourceVersion=0\": dial tcp 77.42.19.180:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 19 09:25:44.099503 kubelet[2552]: I0119 09:25:44.099487 2552 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 19 09:25:44.099854 kubelet[2552]: I0119 09:25:44.099839 2552 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 19 09:25:44.099903 kubelet[2552]: I0119 09:25:44.099863 2552 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Jan 19 09:25:44.099990 kubelet[2552]: W0119 09:25:44.099904 2552 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 19 09:25:44.104104 kubelet[2552]: I0119 09:25:44.103785 2552 server.go:1262] "Started kubelet" Jan 19 09:25:44.104897 kubelet[2552]: I0119 09:25:44.104638 2552 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 19 09:25:44.110163 kubelet[2552]: E0119 09:25:44.109241 2552 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://77.42.19.180:6443/api/v1/namespaces/default/events\": dial tcp 77.42.19.180:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4580-0-0-p-637e605ed7.188c17a35fc7380a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4580-0-0-p-637e605ed7,UID:ci-4580-0-0-p-637e605ed7,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4580-0-0-p-637e605ed7,},FirstTimestamp:2026-01-19 09:25:44.10376193 +0000 UTC m=+0.570446797,LastTimestamp:2026-01-19 09:25:44.10376193 +0000 UTC m=+0.570446797,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4580-0-0-p-637e605ed7,}" Jan 19 09:25:44.110721 kubelet[2552]: I0119 09:25:44.110699 2552 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 19 09:25:44.111000 audit[2565]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2565 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:44.111000 audit[2565]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffca84e5510 a2=0 a3=0 items=0 ppid=2552 pid=2565 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:44.111000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 19 09:25:44.112269 kubelet[2552]: I0119 09:25:44.112251 2552 server.go:310] "Adding debug handlers to kubelet server" Jan 19 09:25:44.112000 audit[2566]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2566 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:44.112000 audit[2566]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdcfe39df0 a2=0 a3=0 items=0 ppid=2552 pid=2566 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:44.112000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 19 09:25:44.114170 kubelet[2552]: I0119 09:25:44.114159 2552 volume_manager.go:313] "Starting Kubelet Volume Manager" Jan 19 09:25:44.114374 kubelet[2552]: E0119 09:25:44.114363 2552 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4580-0-0-p-637e605ed7\" not found" Jan 19 09:25:44.115566 kubelet[2552]: I0119 09:25:44.115536 2552 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 19 09:25:44.115605 kubelet[2552]: I0119 09:25:44.115587 2552 server_v1.go:49] "podresources" method="list" useActivePods=true Jan 19 09:25:44.115000 audit[2570]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2570 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:44.115000 audit[2570]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fff687c40f0 a2=0 a3=0 items=0 ppid=2552 pid=2570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:44.115000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 19 09:25:44.115978 kubelet[2552]: I0119 09:25:44.115808 2552 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 19 09:25:44.116043 kubelet[2552]: I0119 09:25:44.116025 2552 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 19 09:25:44.117193 kubelet[2552]: I0119 09:25:44.117181 2552 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 19 09:25:44.117293 kubelet[2552]: I0119 09:25:44.117285 2552 reconciler.go:29] "Reconciler: start to sync state" Jan 19 09:25:44.118000 audit[2572]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2572 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:44.118000 audit[2572]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffcfc770fe0 a2=0 a3=0 items=0 ppid=2552 pid=2572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:44.118000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 19 09:25:44.119421 kubelet[2552]: E0119 09:25:44.119377 2552 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.19.180:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-637e605ed7?timeout=10s\": dial tcp 77.42.19.180:6443: connect: connection refused" interval="200ms" Jan 19 09:25:44.119774 kubelet[2552]: I0119 09:25:44.119755 2552 factory.go:223] Registration of the systemd container factory successfully Jan 19 09:25:44.119856 kubelet[2552]: I0119 09:25:44.119840 2552 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: 
connect: no such file or directory Jan 19 09:25:44.121821 kubelet[2552]: E0119 09:25:44.121026 2552 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://77.42.19.180:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 77.42.19.180:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 19 09:25:44.121877 kubelet[2552]: E0119 09:25:44.121821 2552 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 19 09:25:44.121905 kubelet[2552]: I0119 09:25:44.121899 2552 factory.go:223] Registration of the containerd container factory successfully Jan 19 09:25:44.124000 audit[2575]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2575 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:44.124000 audit[2575]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffe7e819110 a2=0 a3=0 items=0 ppid=2552 pid=2575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:44.124000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F380000002D2D737263003132372E Jan 19 09:25:44.125548 kubelet[2552]: I0119 09:25:44.125522 2552 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Jan 19 09:25:44.126000 audit[2576]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2576 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:44.126000 audit[2576]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7fff412ea250 a2=0 a3=0 items=0 ppid=2552 pid=2576 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:44.126000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 19 09:25:44.127000 audit[2577]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2577 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:44.129000 kubelet[2552]: I0119 09:25:44.128499 2552 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Jan 19 09:25:44.129000 kubelet[2552]: I0119 09:25:44.128518 2552 status_manager.go:244] "Starting to sync pod status with apiserver" Jan 19 09:25:44.129000 kubelet[2552]: I0119 09:25:44.128550 2552 kubelet.go:2427] "Starting kubelet main sync loop" Jan 19 09:25:44.129000 kubelet[2552]: E0119 09:25:44.128588 2552 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 19 09:25:44.127000 audit[2577]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffccc756010 a2=0 a3=0 items=0 ppid=2552 pid=2577 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:44.127000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 19 09:25:44.131000 audit[2579]: NETFILTER_CFG table=mangle:49 family=10 entries=1 op=nft_register_chain pid=2579 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:44.131000 audit[2579]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe63831df0 a2=0 a3=0 items=0 ppid=2552 pid=2579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:44.131000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 19 09:25:44.136000 audit[2580]: NETFILTER_CFG table=nat:50 family=10 entries=1 op=nft_register_chain pid=2580 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:44.136000 audit[2580]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd2dd53430 a2=0 a3=0 items=0 ppid=2552 pid=2580 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:44.136000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 19 09:25:44.137000 audit[2581]: NETFILTER_CFG table=filter:51 family=10 entries=1 op=nft_register_chain pid=2581 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:44.137000 audit[2581]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffdb5b8d5f0 a2=0 a3=0 items=0 ppid=2552 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:44.137000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 19 09:25:44.140056 kubelet[2552]: E0119 09:25:44.139947 2552 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://77.42.19.180:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 77.42.19.180:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 19 09:25:44.140000 audit[2583]: NETFILTER_CFG table=nat:52 family=2 entries=1 op=nft_register_chain pid=2583 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:44.140000 audit[2583]: SYSCALL 
arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc19f3dae0 a2=0 a3=0 items=0 ppid=2552 pid=2583 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:44.140000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 19 09:25:44.141000 audit[2584]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_chain pid=2584 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:44.141000 audit[2584]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff6ec1ffa0 a2=0 a3=0 items=0 ppid=2552 pid=2584 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:44.141000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 19 09:25:44.146844 kubelet[2552]: I0119 09:25:44.146829 2552 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 19 09:25:44.147057 kubelet[2552]: I0119 09:25:44.147046 2552 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 19 09:25:44.147112 kubelet[2552]: I0119 09:25:44.147107 2552 state_mem.go:36] "Initialized new in-memory state store" Jan 19 09:25:44.149182 kubelet[2552]: I0119 09:25:44.149008 2552 policy_none.go:49] "None policy: Start" Jan 19 09:25:44.149182 kubelet[2552]: I0119 09:25:44.149021 2552 memory_manager.go:187] "Starting memorymanager" policy="None" Jan 19 09:25:44.149182 kubelet[2552]: I0119 09:25:44.149032 2552 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Jan 19 09:25:44.150145 kubelet[2552]: I0119 09:25:44.150136 2552 policy_none.go:47] "Start" Jan 19 09:25:44.154712 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 19 09:25:44.166715 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 19 09:25:44.170636 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 19 09:25:44.183293 kubelet[2552]: E0119 09:25:44.183084 2552 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 19 09:25:44.184427 kubelet[2552]: I0119 09:25:44.184062 2552 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 19 09:25:44.184427 kubelet[2552]: I0119 09:25:44.184073 2552 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 19 09:25:44.185379 kubelet[2552]: I0119 09:25:44.185370 2552 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 19 09:25:44.186620 kubelet[2552]: E0119 09:25:44.186607 2552 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 19 09:25:44.186861 kubelet[2552]: E0119 09:25:44.186803 2552 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4580-0-0-p-637e605ed7\" not found" Jan 19 09:25:44.251158 systemd[1]: Created slice kubepods-burstable-pod3d364a27a1cfd279606e69f4863203b3.slice - libcontainer container kubepods-burstable-pod3d364a27a1cfd279606e69f4863203b3.slice. 
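The NETFILTER_CFG audit records above carry the full iptables invocation hex-encoded in their PROCTITLE field, as NUL-separated argv words. A minimal decoder for reading those values back out of a journal dump like this one (plain Python 3; the helper name and the decoding step are an editor's sketch, not something the log itself provides):

    def decode_proctitle(hex_argv: str) -> str:
        # PROCTITLE is the audited process's argv, hex-encoded with NUL separators.
        return " ".join(arg.decode() for arg in bytes.fromhex(hex_argv).split(b"\x00") if arg)

    # The proctitle from the mangle-table record above decodes to:
    #   iptables -w 5 -N KUBE-KUBELET-CANARY -t mangle
    print(decode_proctitle(
        "69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65"
    ))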
Jan 19 09:25:44.283137 kubelet[2552]: E0119 09:25:44.283072 2552 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-637e605ed7\" not found" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.287150 kubelet[2552]: I0119 09:25:44.287088 2552 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.288054 kubelet[2552]: E0119 09:25:44.288008 2552 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.19.180:6443/api/v1/nodes\": dial tcp 77.42.19.180:6443: connect: connection refused" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.291663 systemd[1]: Created slice kubepods-burstable-poda39fa377be8cdc70cc7b513f2f125238.slice - libcontainer container kubepods-burstable-poda39fa377be8cdc70cc7b513f2f125238.slice. Jan 19 09:25:44.304217 kubelet[2552]: E0119 09:25:44.304153 2552 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-637e605ed7\" not found" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.311599 systemd[1]: Created slice kubepods-burstable-pod4b2d307ab72047ae819ff0d101b25605.slice - libcontainer container kubepods-burstable-pod4b2d307ab72047ae819ff0d101b25605.slice. Jan 19 09:25:44.317242 kubelet[2552]: E0119 09:25:44.317190 2552 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-637e605ed7\" not found" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.319294 kubelet[2552]: I0119 09:25:44.319221 2552 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4b2d307ab72047ae819ff0d101b25605-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4580-0-0-p-637e605ed7\" (UID: \"4b2d307ab72047ae819ff0d101b25605\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.319294 kubelet[2552]: I0119 09:25:44.319267 2552 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3d364a27a1cfd279606e69f4863203b3-kubeconfig\") pod \"kube-scheduler-ci-4580-0-0-p-637e605ed7\" (UID: \"3d364a27a1cfd279606e69f4863203b3\") " pod="kube-system/kube-scheduler-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.319294 kubelet[2552]: I0119 09:25:44.319297 2552 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a39fa377be8cdc70cc7b513f2f125238-ca-certs\") pod \"kube-apiserver-ci-4580-0-0-p-637e605ed7\" (UID: \"a39fa377be8cdc70cc7b513f2f125238\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.319294 kubelet[2552]: I0119 09:25:44.319320 2552 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a39fa377be8cdc70cc7b513f2f125238-k8s-certs\") pod \"kube-apiserver-ci-4580-0-0-p-637e605ed7\" (UID: \"a39fa377be8cdc70cc7b513f2f125238\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.320318 kubelet[2552]: I0119 09:25:44.319343 2552 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a39fa377be8cdc70cc7b513f2f125238-usr-share-ca-certificates\") pod 
\"kube-apiserver-ci-4580-0-0-p-637e605ed7\" (UID: \"a39fa377be8cdc70cc7b513f2f125238\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.320318 kubelet[2552]: I0119 09:25:44.319369 2552 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4b2d307ab72047ae819ff0d101b25605-ca-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-637e605ed7\" (UID: \"4b2d307ab72047ae819ff0d101b25605\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.320318 kubelet[2552]: I0119 09:25:44.319428 2552 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4b2d307ab72047ae819ff0d101b25605-flexvolume-dir\") pod \"kube-controller-manager-ci-4580-0-0-p-637e605ed7\" (UID: \"4b2d307ab72047ae819ff0d101b25605\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.320318 kubelet[2552]: I0119 09:25:44.319453 2552 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4b2d307ab72047ae819ff0d101b25605-k8s-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-637e605ed7\" (UID: \"4b2d307ab72047ae819ff0d101b25605\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.320318 kubelet[2552]: I0119 09:25:44.319478 2552 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4b2d307ab72047ae819ff0d101b25605-kubeconfig\") pod \"kube-controller-manager-ci-4580-0-0-p-637e605ed7\" (UID: \"4b2d307ab72047ae819ff0d101b25605\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.320567 kubelet[2552]: E0119 09:25:44.320073 2552 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.19.180:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-637e605ed7?timeout=10s\": dial tcp 77.42.19.180:6443: connect: connection refused" interval="400ms" Jan 19 09:25:44.493485 kubelet[2552]: I0119 09:25:44.491706 2552 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.493485 kubelet[2552]: E0119 09:25:44.492213 2552 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.19.180:6443/api/v1/nodes\": dial tcp 77.42.19.180:6443: connect: connection refused" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.588336 containerd[1676]: time="2026-01-19T09:25:44.588283093Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4580-0-0-p-637e605ed7,Uid:3d364a27a1cfd279606e69f4863203b3,Namespace:kube-system,Attempt:0,}" Jan 19 09:25:44.608182 containerd[1676]: time="2026-01-19T09:25:44.608126880Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4580-0-0-p-637e605ed7,Uid:a39fa377be8cdc70cc7b513f2f125238,Namespace:kube-system,Attempt:0,}" Jan 19 09:25:44.623438 containerd[1676]: time="2026-01-19T09:25:44.622781812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4580-0-0-p-637e605ed7,Uid:4b2d307ab72047ae819ff0d101b25605,Namespace:kube-system,Attempt:0,}" Jan 19 09:25:44.721955 kubelet[2552]: E0119 09:25:44.721814 2552 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://77.42.19.180:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-637e605ed7?timeout=10s\": dial tcp 77.42.19.180:6443: connect: connection refused" interval="800ms" Jan 19 09:25:44.895060 kubelet[2552]: I0119 09:25:44.895009 2552 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.895365 kubelet[2552]: E0119 09:25:44.895276 2552 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.19.180:6443/api/v1/nodes\": dial tcp 77.42.19.180:6443: connect: connection refused" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:44.952297 kubelet[2552]: E0119 09:25:44.952229 2552 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://77.42.19.180:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 77.42.19.180:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 19 09:25:45.094473 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2589541800.mount: Deactivated successfully. Jan 19 09:25:45.106820 containerd[1676]: time="2026-01-19T09:25:45.106731615Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 19 09:25:45.109606 containerd[1676]: time="2026-01-19T09:25:45.109562768Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 19 09:25:45.111543 containerd[1676]: time="2026-01-19T09:25:45.111475009Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 19 09:25:45.114541 containerd[1676]: time="2026-01-19T09:25:45.114485032Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 19 09:25:45.115764 containerd[1676]: time="2026-01-19T09:25:45.115722963Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 19 09:25:45.115844 containerd[1676]: time="2026-01-19T09:25:45.115803013Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 19 09:25:45.117004 containerd[1676]: time="2026-01-19T09:25:45.116957774Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 19 09:25:45.119017 containerd[1676]: time="2026-01-19T09:25:45.117267514Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 19 09:25:45.119017 containerd[1676]: time="2026-01-19T09:25:45.118553505Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 492.2899ms" Jan 
19 09:25:45.119858 kubelet[2552]: E0119 09:25:45.119812 2552 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://77.42.19.180:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 77.42.19.180:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 19 09:25:45.120735 containerd[1676]: time="2026-01-19T09:25:45.120665637Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 529.908292ms" Jan 19 09:25:45.127453 containerd[1676]: time="2026-01-19T09:25:45.127218512Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 516.54395ms" Jan 19 09:25:45.151932 containerd[1676]: time="2026-01-19T09:25:45.151548693Z" level=info msg="connecting to shim 31992d85513ba8ca835eb6dabb2b0e4884baaec119beb6883d661ef94758149d" address="unix:///run/containerd/s/e919c13deb0ac45eeadce23447329831c7bd60a1fa7bdf5921cd2cbbe9829f31" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:25:45.170476 containerd[1676]: time="2026-01-19T09:25:45.170256748Z" level=info msg="connecting to shim d56cce1ea29d845bb3cd4a0cfe1cc111d146b1e361f848116e3745a8f22fe120" address="unix:///run/containerd/s/cea780955e301682bc3016340dafa5dece290eea84c26c619f2ce28192434b95" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:25:45.192559 containerd[1676]: time="2026-01-19T09:25:45.192492237Z" level=info msg="connecting to shim 211cd231f2b6f57bcfdadbda79edb525c13aeac284be0d79e1daa1af3af5e31e" address="unix:///run/containerd/s/cb2205fda6fb019989c5dc53d0eb58411e0621080fef30f3dbe63319f3a50315" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:25:45.215869 systemd[1]: Started cri-containerd-31992d85513ba8ca835eb6dabb2b0e4884baaec119beb6883d661ef94758149d.scope - libcontainer container 31992d85513ba8ca835eb6dabb2b0e4884baaec119beb6883d661ef94758149d. Jan 19 09:25:45.227671 systemd[1]: Started cri-containerd-d56cce1ea29d845bb3cd4a0cfe1cc111d146b1e361f848116e3745a8f22fe120.scope - libcontainer container d56cce1ea29d845bb3cd4a0cfe1cc111d146b1e361f848116e3745a8f22fe120. Jan 19 09:25:45.252681 systemd[1]: Started cri-containerd-211cd231f2b6f57bcfdadbda79edb525c13aeac284be0d79e1daa1af3af5e31e.scope - libcontainer container 211cd231f2b6f57bcfdadbda79edb525c13aeac284be0d79e1daa1af3af5e31e. 
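Every SYSCALL record in this section reports arch=c000003e, i.e. x86_64, so the syscall numbers can be read against the standard x86_64 table (general knowledge, not stated in the log): 46 is sendmsg, consistent with xtables-nft-multi committing its netlink batch; 321 is bpf and 3 is close, consistent with the runc records that follow loading BPF programs and then dropping the file descriptors, which matches the paired prog-id LOAD/UNLOAD events. A small annotator sketch along those lines:

    import re

    # Only the syscall numbers that actually appear in this journal, per the x86_64 table.
    X86_64_SYSCALLS = {3: "close", 46: "sendmsg", 321: "bpf"}

    def annotate_syscall(record: str) -> str:
        """Append the syscall name to an x86_64 audit SYSCALL record; leave other lines unchanged."""
        m = re.search(r"arch=c000003e .*?syscall=(\d+)", record)
        if not m:
            return record
        return record + "   <- " + X86_64_SYSCALLS.get(int(m.group(1)), "unknown")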
Jan 19 09:25:45.259323 kernel: kauditd_printk_skb: 72 callbacks suppressed Jan 19 09:25:45.259547 kernel: audit: type=1334 audit(1768814745.255:358): prog-id=83 op=LOAD Jan 19 09:25:45.255000 audit: BPF prog-id=83 op=LOAD Jan 19 09:25:45.259000 audit: BPF prog-id=84 op=LOAD Jan 19 09:25:45.268954 kernel: audit: type=1334 audit(1768814745.259:359): prog-id=84 op=LOAD Jan 19 09:25:45.269053 kernel: audit: type=1300 audit(1768814745.259:359): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=2618 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.259000 audit[2632]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=2618 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.274689 kernel: audit: type=1327 audit(1768814745.259:359): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435366363653165613239643834356262336364346130636665316363 Jan 19 09:25:45.259000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435366363653165613239643834356262336364346130636665316363 Jan 19 09:25:45.276508 kernel: audit: type=1334 audit(1768814745.259:360): prog-id=84 op=UNLOAD Jan 19 09:25:45.259000 audit: BPF prog-id=84 op=UNLOAD Jan 19 09:25:45.281961 kernel: audit: type=1300 audit(1768814745.259:360): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2618 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.259000 audit[2632]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2618 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.286455 kernel: audit: type=1327 audit(1768814745.259:360): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435366363653165613239643834356262336364346130636665316363 Jan 19 09:25:45.259000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435366363653165613239643834356262336364346130636665316363 Jan 19 09:25:45.286583 kubelet[2552]: E0119 09:25:45.286014 2552 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://77.42.19.180:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4580-0-0-p-637e605ed7&limit=500&resourceVersion=0\": dial tcp 77.42.19.180:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.Node" Jan 19 09:25:45.289346 kernel: audit: type=1334 audit(1768814745.259:361): prog-id=85 op=LOAD Jan 19 09:25:45.259000 audit: BPF prog-id=85 op=LOAD Jan 19 09:25:45.259000 audit[2632]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=2618 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.300893 kernel: audit: type=1300 audit(1768814745.259:361): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=2618 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.301023 kernel: audit: type=1327 audit(1768814745.259:361): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435366363653165613239643834356262336364346130636665316363 Jan 19 09:25:45.259000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435366363653165613239643834356262336364346130636665316363 Jan 19 09:25:45.259000 audit: BPF prog-id=86 op=LOAD Jan 19 09:25:45.259000 audit[2632]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=2618 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.259000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435366363653165613239643834356262336364346130636665316363 Jan 19 09:25:45.259000 audit: BPF prog-id=86 op=UNLOAD Jan 19 09:25:45.259000 audit[2632]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2618 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.259000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435366363653165613239643834356262336364346130636665316363 Jan 19 09:25:45.259000 audit: BPF prog-id=85 op=UNLOAD Jan 19 09:25:45.259000 audit[2632]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2618 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.259000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435366363653165613239643834356262336364346130636665316363 Jan 19 09:25:45.259000 audit: BPF 
prog-id=87 op=LOAD Jan 19 09:25:45.259000 audit[2632]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=2618 pid=2632 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.259000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435366363653165613239643834356262336364346130636665316363 Jan 19 09:25:45.260000 audit: BPF prog-id=88 op=LOAD Jan 19 09:25:45.262000 audit: BPF prog-id=89 op=LOAD Jan 19 09:25:45.262000 audit[2628]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2599 pid=2628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.262000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932643835353133626138636138333565623664616262326230 Jan 19 09:25:45.262000 audit: BPF prog-id=89 op=UNLOAD Jan 19 09:25:45.262000 audit[2628]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.262000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932643835353133626138636138333565623664616262326230 Jan 19 09:25:45.263000 audit: BPF prog-id=90 op=LOAD Jan 19 09:25:45.263000 audit[2628]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2599 pid=2628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.263000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932643835353133626138636138333565623664616262326230 Jan 19 09:25:45.263000 audit: BPF prog-id=91 op=LOAD Jan 19 09:25:45.263000 audit[2628]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2599 pid=2628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.263000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932643835353133626138636138333565623664616262326230 Jan 19 09:25:45.263000 audit: BPF prog-id=91 op=UNLOAD Jan 19 09:25:45.263000 audit[2628]: SYSCALL arch=c000003e syscall=3 
success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.263000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932643835353133626138636138333565623664616262326230 Jan 19 09:25:45.263000 audit: BPF prog-id=90 op=UNLOAD Jan 19 09:25:45.263000 audit[2628]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.263000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932643835353133626138636138333565623664616262326230 Jan 19 09:25:45.263000 audit: BPF prog-id=92 op=LOAD Jan 19 09:25:45.263000 audit[2628]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2599 pid=2628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.263000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331393932643835353133626138636138333565623664616262326230 Jan 19 09:25:45.270000 audit: BPF prog-id=93 op=LOAD Jan 19 09:25:45.270000 audit: BPF prog-id=94 op=LOAD Jan 19 09:25:45.270000 audit[2672]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2645 pid=2672 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.270000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231316364323331663262366635376263666461646264613739656462 Jan 19 09:25:45.271000 audit: BPF prog-id=94 op=UNLOAD Jan 19 09:25:45.271000 audit[2672]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2672 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.271000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231316364323331663262366635376263666461646264613739656462 Jan 19 09:25:45.271000 audit: BPF prog-id=95 op=LOAD Jan 19 09:25:45.271000 audit[2672]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2645 pid=2672 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.271000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231316364323331663262366635376263666461646264613739656462 Jan 19 09:25:45.301000 audit: BPF prog-id=96 op=LOAD Jan 19 09:25:45.301000 audit[2672]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2645 pid=2672 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.301000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231316364323331663262366635376263666461646264613739656462 Jan 19 09:25:45.301000 audit: BPF prog-id=96 op=UNLOAD Jan 19 09:25:45.301000 audit[2672]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2672 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.301000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231316364323331663262366635376263666461646264613739656462 Jan 19 09:25:45.301000 audit: BPF prog-id=95 op=UNLOAD Jan 19 09:25:45.301000 audit[2672]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2672 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.301000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231316364323331663262366635376263666461646264613739656462 Jan 19 09:25:45.301000 audit: BPF prog-id=97 op=LOAD Jan 19 09:25:45.301000 audit[2672]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2645 pid=2672 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.301000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231316364323331663262366635376263666461646264613739656462 Jan 19 09:25:45.338781 containerd[1676]: time="2026-01-19T09:25:45.338729369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4580-0-0-p-637e605ed7,Uid:4b2d307ab72047ae819ff0d101b25605,Namespace:kube-system,Attempt:0,} returns sandbox id \"d56cce1ea29d845bb3cd4a0cfe1cc111d146b1e361f848116e3745a8f22fe120\"" Jan 19 09:25:45.341963 containerd[1676]: 
time="2026-01-19T09:25:45.341854481Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4580-0-0-p-637e605ed7,Uid:3d364a27a1cfd279606e69f4863203b3,Namespace:kube-system,Attempt:0,} returns sandbox id \"31992d85513ba8ca835eb6dabb2b0e4884baaec119beb6883d661ef94758149d\"" Jan 19 09:25:45.350183 containerd[1676]: time="2026-01-19T09:25:45.350112558Z" level=info msg="CreateContainer within sandbox \"d56cce1ea29d845bb3cd4a0cfe1cc111d146b1e361f848116e3745a8f22fe120\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 19 09:25:45.351536 containerd[1676]: time="2026-01-19T09:25:45.351468269Z" level=info msg="CreateContainer within sandbox \"31992d85513ba8ca835eb6dabb2b0e4884baaec119beb6883d661ef94758149d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 19 09:25:45.360000 containerd[1676]: time="2026-01-19T09:25:45.359948246Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4580-0-0-p-637e605ed7,Uid:a39fa377be8cdc70cc7b513f2f125238,Namespace:kube-system,Attempt:0,} returns sandbox id \"211cd231f2b6f57bcfdadbda79edb525c13aeac284be0d79e1daa1af3af5e31e\"" Jan 19 09:25:45.363504 containerd[1676]: time="2026-01-19T09:25:45.363450739Z" level=info msg="CreateContainer within sandbox \"211cd231f2b6f57bcfdadbda79edb525c13aeac284be0d79e1daa1af3af5e31e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 19 09:25:45.365803 containerd[1676]: time="2026-01-19T09:25:45.365747681Z" level=info msg="Container bcf6a2742f6d0b8b41af109b921083c1d83650af77324cf0d342b6097830124e: CDI devices from CRI Config.CDIDevices: []" Jan 19 09:25:45.380139 containerd[1676]: time="2026-01-19T09:25:45.379935673Z" level=info msg="CreateContainer within sandbox \"d56cce1ea29d845bb3cd4a0cfe1cc111d146b1e361f848116e3745a8f22fe120\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"bcf6a2742f6d0b8b41af109b921083c1d83650af77324cf0d342b6097830124e\"" Jan 19 09:25:45.381524 containerd[1676]: time="2026-01-19T09:25:45.381484984Z" level=info msg="StartContainer for \"bcf6a2742f6d0b8b41af109b921083c1d83650af77324cf0d342b6097830124e\"" Jan 19 09:25:45.383193 containerd[1676]: time="2026-01-19T09:25:45.383149626Z" level=info msg="connecting to shim bcf6a2742f6d0b8b41af109b921083c1d83650af77324cf0d342b6097830124e" address="unix:///run/containerd/s/cea780955e301682bc3016340dafa5dece290eea84c26c619f2ce28192434b95" protocol=ttrpc version=3 Jan 19 09:25:45.383871 containerd[1676]: time="2026-01-19T09:25:45.383838146Z" level=info msg="Container 12978616055efbac44cee81d13b062b80e36c9ac2a5df26aa3e2053667014e1e: CDI devices from CRI Config.CDIDevices: []" Jan 19 09:25:45.402860 containerd[1676]: time="2026-01-19T09:25:45.402003931Z" level=info msg="CreateContainer within sandbox \"31992d85513ba8ca835eb6dabb2b0e4884baaec119beb6883d661ef94758149d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"12978616055efbac44cee81d13b062b80e36c9ac2a5df26aa3e2053667014e1e\"" Jan 19 09:25:45.404226 containerd[1676]: time="2026-01-19T09:25:45.403629613Z" level=info msg="StartContainer for \"12978616055efbac44cee81d13b062b80e36c9ac2a5df26aa3e2053667014e1e\"" Jan 19 09:25:45.405387 containerd[1676]: time="2026-01-19T09:25:45.405330494Z" level=info msg="connecting to shim 12978616055efbac44cee81d13b062b80e36c9ac2a5df26aa3e2053667014e1e" address="unix:///run/containerd/s/e919c13deb0ac45eeadce23447329831c7bd60a1fa7bdf5921cd2cbbe9829f31" protocol=ttrpc version=3 Jan 19 09:25:45.411827 
containerd[1676]: time="2026-01-19T09:25:45.411786139Z" level=info msg="Container 814470e2b9f6a80a1586ce9fbf95550d1e4226c67bfc2f6f30ea9962226c030f: CDI devices from CRI Config.CDIDevices: []" Jan 19 09:25:45.418970 systemd[1]: Started cri-containerd-bcf6a2742f6d0b8b41af109b921083c1d83650af77324cf0d342b6097830124e.scope - libcontainer container bcf6a2742f6d0b8b41af109b921083c1d83650af77324cf0d342b6097830124e. Jan 19 09:25:45.439880 systemd[1]: Started cri-containerd-12978616055efbac44cee81d13b062b80e36c9ac2a5df26aa3e2053667014e1e.scope - libcontainer container 12978616055efbac44cee81d13b062b80e36c9ac2a5df26aa3e2053667014e1e. Jan 19 09:25:45.444000 audit: BPF prog-id=98 op=LOAD Jan 19 09:25:45.445000 audit: BPF prog-id=99 op=LOAD Jan 19 09:25:45.445000 audit[2730]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2618 pid=2730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.445000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263663661323734326636643062386234316166313039623932313038 Jan 19 09:25:45.445000 audit: BPF prog-id=99 op=UNLOAD Jan 19 09:25:45.445000 audit[2730]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2618 pid=2730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.445000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263663661323734326636643062386234316166313039623932313038 Jan 19 09:25:45.446762 containerd[1676]: time="2026-01-19T09:25:45.446384278Z" level=info msg="CreateContainer within sandbox \"211cd231f2b6f57bcfdadbda79edb525c13aeac284be0d79e1daa1af3af5e31e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"814470e2b9f6a80a1586ce9fbf95550d1e4226c67bfc2f6f30ea9962226c030f\"" Jan 19 09:25:45.447215 containerd[1676]: time="2026-01-19T09:25:45.447176259Z" level=info msg="StartContainer for \"814470e2b9f6a80a1586ce9fbf95550d1e4226c67bfc2f6f30ea9962226c030f\"" Jan 19 09:25:45.447000 audit: BPF prog-id=100 op=LOAD Jan 19 09:25:45.447000 audit[2730]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2618 pid=2730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.447000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263663661323734326636643062386234316166313039623932313038 Jan 19 09:25:45.447000 audit: BPF prog-id=101 op=LOAD Jan 19 09:25:45.447000 audit[2730]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2618 pid=2730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.447000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263663661323734326636643062386234316166313039623932313038 Jan 19 09:25:45.447000 audit: BPF prog-id=101 op=UNLOAD Jan 19 09:25:45.447000 audit[2730]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2618 pid=2730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.447000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263663661323734326636643062386234316166313039623932313038 Jan 19 09:25:45.447000 audit: BPF prog-id=100 op=UNLOAD Jan 19 09:25:45.447000 audit[2730]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2618 pid=2730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.447000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263663661323734326636643062386234316166313039623932313038 Jan 19 09:25:45.447000 audit: BPF prog-id=102 op=LOAD Jan 19 09:25:45.447000 audit[2730]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2618 pid=2730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.447000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263663661323734326636643062386234316166313039623932313038 Jan 19 09:25:45.449792 containerd[1676]: time="2026-01-19T09:25:45.449736031Z" level=info msg="connecting to shim 814470e2b9f6a80a1586ce9fbf95550d1e4226c67bfc2f6f30ea9962226c030f" address="unix:///run/containerd/s/cb2205fda6fb019989c5dc53d0eb58411e0621080fef30f3dbe63319f3a50315" protocol=ttrpc version=3 Jan 19 09:25:45.470614 systemd[1]: Started cri-containerd-814470e2b9f6a80a1586ce9fbf95550d1e4226c67bfc2f6f30ea9962226c030f.scope - libcontainer container 814470e2b9f6a80a1586ce9fbf95550d1e4226c67bfc2f6f30ea9962226c030f. 
Jan 19 09:25:45.483000 audit: BPF prog-id=103 op=LOAD Jan 19 09:25:45.484000 audit: BPF prog-id=104 op=LOAD Jan 19 09:25:45.484000 audit[2741]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000228238 a2=98 a3=0 items=0 ppid=2599 pid=2741 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.484000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132393738363136303535656662616334346365653831643133623036 Jan 19 09:25:45.484000 audit: BPF prog-id=104 op=UNLOAD Jan 19 09:25:45.484000 audit[2741]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2741 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.484000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132393738363136303535656662616334346365653831643133623036 Jan 19 09:25:45.484000 audit: BPF prog-id=105 op=LOAD Jan 19 09:25:45.484000 audit[2741]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000228488 a2=98 a3=0 items=0 ppid=2599 pid=2741 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.484000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132393738363136303535656662616334346365653831643133623036 Jan 19 09:25:45.484000 audit: BPF prog-id=106 op=LOAD Jan 19 09:25:45.484000 audit[2741]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000228218 a2=98 a3=0 items=0 ppid=2599 pid=2741 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.484000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132393738363136303535656662616334346365653831643133623036 Jan 19 09:25:45.484000 audit: BPF prog-id=106 op=UNLOAD Jan 19 09:25:45.484000 audit[2741]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2741 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.484000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132393738363136303535656662616334346365653831643133623036 Jan 19 09:25:45.484000 audit: BPF prog-id=105 op=UNLOAD Jan 19 09:25:45.484000 audit[2741]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2741 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.484000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132393738363136303535656662616334346365653831643133623036 Jan 19 09:25:45.484000 audit: BPF prog-id=107 op=LOAD Jan 19 09:25:45.484000 audit[2741]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0002286e8 a2=98 a3=0 items=0 ppid=2599 pid=2741 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.484000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132393738363136303535656662616334346365653831643133623036 Jan 19 09:25:45.500000 audit: BPF prog-id=108 op=LOAD Jan 19 09:25:45.501000 audit: BPF prog-id=109 op=LOAD Jan 19 09:25:45.501000 audit[2767]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2645 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831343437306532623966366138306131353836636539666266393535 Jan 19 09:25:45.501000 audit: BPF prog-id=109 op=UNLOAD Jan 19 09:25:45.501000 audit[2767]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831343437306532623966366138306131353836636539666266393535 Jan 19 09:25:45.501000 audit: BPF prog-id=110 op=LOAD Jan 19 09:25:45.501000 audit[2767]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2645 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831343437306532623966366138306131353836636539666266393535 Jan 19 09:25:45.501000 audit: BPF prog-id=111 op=LOAD Jan 19 09:25:45.501000 audit[2767]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2645 pid=2767 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831343437306532623966366138306131353836636539666266393535 Jan 19 09:25:45.501000 audit: BPF prog-id=111 op=UNLOAD Jan 19 09:25:45.501000 audit[2767]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831343437306532623966366138306131353836636539666266393535 Jan 19 09:25:45.501000 audit: BPF prog-id=110 op=UNLOAD Jan 19 09:25:45.501000 audit[2767]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831343437306532623966366138306131353836636539666266393535 Jan 19 09:25:45.501000 audit: BPF prog-id=112 op=LOAD Jan 19 09:25:45.501000 audit[2767]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2645 pid=2767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:45.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831343437306532623966366138306131353836636539666266393535 Jan 19 09:25:45.520717 containerd[1676]: time="2026-01-19T09:25:45.520666920Z" level=info msg="StartContainer for \"bcf6a2742f6d0b8b41af109b921083c1d83650af77324cf0d342b6097830124e\" returns successfully" Jan 19 09:25:45.524137 kubelet[2552]: E0119 09:25:45.524083 2552 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.19.180:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-637e605ed7?timeout=10s\": dial tcp 77.42.19.180:6443: connect: connection refused" interval="1.6s" Jan 19 09:25:45.524526 kubelet[2552]: E0119 09:25:45.524300 2552 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://77.42.19.180:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 77.42.19.180:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 19 09:25:45.552430 containerd[1676]: time="2026-01-19T09:25:45.552348646Z" level=info 
msg="StartContainer for \"12978616055efbac44cee81d13b062b80e36c9ac2a5df26aa3e2053667014e1e\" returns successfully" Jan 19 09:25:45.588618 containerd[1676]: time="2026-01-19T09:25:45.587675906Z" level=info msg="StartContainer for \"814470e2b9f6a80a1586ce9fbf95550d1e4226c67bfc2f6f30ea9962226c030f\" returns successfully" Jan 19 09:25:45.699447 kubelet[2552]: I0119 09:25:45.699326 2552 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:46.171679 kubelet[2552]: E0119 09:25:46.170498 2552 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-637e605ed7\" not found" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:46.172988 kubelet[2552]: E0119 09:25:46.172958 2552 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-637e605ed7\" not found" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:46.178676 kubelet[2552]: E0119 09:25:46.178635 2552 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-637e605ed7\" not found" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:47.188421 kubelet[2552]: E0119 09:25:47.186498 2552 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-637e605ed7\" not found" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:47.188421 kubelet[2552]: E0119 09:25:47.186899 2552 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-637e605ed7\" not found" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:47.304048 kubelet[2552]: E0119 09:25:47.304013 2552 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4580-0-0-p-637e605ed7\" not found" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:47.416209 kubelet[2552]: I0119 09:25:47.415575 2552 kubelet_node_status.go:78] "Successfully registered node" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:47.416209 kubelet[2552]: E0119 09:25:47.415641 2552 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4580-0-0-p-637e605ed7\": node \"ci-4580-0-0-p-637e605ed7\" not found" Jan 19 09:25:47.515686 kubelet[2552]: I0119 09:25:47.515553 2552 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:47.523612 kubelet[2552]: E0119 09:25:47.523422 2552 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4580-0-0-p-637e605ed7\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:47.523612 kubelet[2552]: I0119 09:25:47.523453 2552 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:47.526347 kubelet[2552]: E0119 09:25:47.526190 2552 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4580-0-0-p-637e605ed7\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:47.526347 kubelet[2552]: I0119 09:25:47.526215 2552 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:47.528105 kubelet[2552]: E0119 09:25:47.528076 2552 kubelet.go:3221] "Failed 
creating a mirror pod" err="pods \"kube-controller-manager-ci-4580-0-0-p-637e605ed7\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:48.105277 kubelet[2552]: I0119 09:25:48.105220 2552 apiserver.go:52] "Watching apiserver" Jan 19 09:25:48.118111 kubelet[2552]: I0119 09:25:48.118039 2552 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 19 09:25:48.182141 kubelet[2552]: I0119 09:25:48.182077 2552 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:49.483481 systemd[1]: Reload requested from client PID 2829 ('systemctl') (unit session-8.scope)... Jan 19 09:25:49.483500 systemd[1]: Reloading... Jan 19 09:25:49.568494 zram_generator::config[2876]: No configuration found. Jan 19 09:25:49.832102 systemd[1]: Reloading finished in 348 ms. Jan 19 09:25:49.866003 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 19 09:25:49.887219 systemd[1]: kubelet.service: Deactivated successfully. Jan 19 09:25:49.887620 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 19 09:25:49.887000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:49.887702 systemd[1]: kubelet.service: Consumed 1.043s CPU time, 123.7M memory peak. Jan 19 09:25:49.891000 audit: BPF prog-id=113 op=LOAD Jan 19 09:25:49.891000 audit: BPF prog-id=68 op=UNLOAD Jan 19 09:25:49.891159 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 19 09:25:49.892000 audit: BPF prog-id=114 op=LOAD Jan 19 09:25:49.892000 audit: BPF prog-id=115 op=LOAD Jan 19 09:25:49.892000 audit: BPF prog-id=69 op=UNLOAD Jan 19 09:25:49.892000 audit: BPF prog-id=70 op=UNLOAD Jan 19 09:25:49.895000 audit: BPF prog-id=116 op=LOAD Jan 19 09:25:49.895000 audit: BPF prog-id=78 op=UNLOAD Jan 19 09:25:49.895000 audit: BPF prog-id=117 op=LOAD Jan 19 09:25:49.895000 audit: BPF prog-id=118 op=LOAD Jan 19 09:25:49.895000 audit: BPF prog-id=79 op=UNLOAD Jan 19 09:25:49.895000 audit: BPF prog-id=80 op=UNLOAD Jan 19 09:25:49.896000 audit: BPF prog-id=119 op=LOAD Jan 19 09:25:49.896000 audit: BPF prog-id=71 op=UNLOAD Jan 19 09:25:49.896000 audit: BPF prog-id=120 op=LOAD Jan 19 09:25:49.896000 audit: BPF prog-id=121 op=LOAD Jan 19 09:25:49.896000 audit: BPF prog-id=72 op=UNLOAD Jan 19 09:25:49.896000 audit: BPF prog-id=73 op=UNLOAD Jan 19 09:25:49.897000 audit: BPF prog-id=122 op=LOAD Jan 19 09:25:49.904000 audit: BPF prog-id=65 op=UNLOAD Jan 19 09:25:49.904000 audit: BPF prog-id=123 op=LOAD Jan 19 09:25:49.904000 audit: BPF prog-id=124 op=LOAD Jan 19 09:25:49.904000 audit: BPF prog-id=66 op=UNLOAD Jan 19 09:25:49.904000 audit: BPF prog-id=67 op=UNLOAD Jan 19 09:25:49.905000 audit: BPF prog-id=125 op=LOAD Jan 19 09:25:49.905000 audit: BPF prog-id=74 op=UNLOAD Jan 19 09:25:49.906000 audit: BPF prog-id=126 op=LOAD Jan 19 09:25:49.906000 audit: BPF prog-id=63 op=UNLOAD Jan 19 09:25:49.906000 audit: BPF prog-id=127 op=LOAD Jan 19 09:25:49.906000 audit: BPF prog-id=75 op=UNLOAD Jan 19 09:25:49.906000 audit: BPF prog-id=128 op=LOAD Jan 19 09:25:49.906000 audit: BPF prog-id=129 op=LOAD Jan 19 09:25:49.906000 audit: BPF prog-id=76 op=UNLOAD Jan 19 09:25:49.906000 audit: BPF prog-id=77 op=UNLOAD Jan 19 09:25:49.907000 audit: BPF prog-id=130 op=LOAD Jan 19 09:25:49.907000 audit: BPF prog-id=131 op=LOAD Jan 19 09:25:49.907000 audit: BPF prog-id=81 op=UNLOAD Jan 19 09:25:49.907000 audit: BPF prog-id=82 op=UNLOAD Jan 19 09:25:49.908000 audit: BPF prog-id=132 op=LOAD Jan 19 09:25:49.908000 audit: BPF prog-id=64 op=UNLOAD Jan 19 09:25:50.078666 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 19 09:25:50.078000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:25:50.094872 (kubelet)[2926]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 19 09:25:50.151452 kubelet[2926]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 19 09:25:50.151452 kubelet[2926]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 19 09:25:50.151893 kubelet[2926]: I0119 09:25:50.151489 2926 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 19 09:25:50.157948 kubelet[2926]: I0119 09:25:50.157915 2926 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Jan 19 09:25:50.157948 kubelet[2926]: I0119 09:25:50.157937 2926 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 19 09:25:50.158065 kubelet[2926]: I0119 09:25:50.157965 2926 watchdog_linux.go:95] "Systemd watchdog is not enabled" Jan 19 09:25:50.158065 kubelet[2926]: I0119 09:25:50.157977 2926 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 19 09:25:50.160413 kubelet[2926]: I0119 09:25:50.158517 2926 server.go:956] "Client rotation is on, will bootstrap in background" Jan 19 09:25:50.160413 kubelet[2926]: I0119 09:25:50.159978 2926 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jan 19 09:25:50.161768 kubelet[2926]: I0119 09:25:50.161753 2926 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 19 09:25:50.164476 kubelet[2926]: I0119 09:25:50.164453 2926 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 19 09:25:50.168705 kubelet[2926]: I0119 09:25:50.168676 2926 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /" Jan 19 09:25:50.169004 kubelet[2926]: I0119 09:25:50.168970 2926 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 19 09:25:50.169152 kubelet[2926]: I0119 09:25:50.169001 2926 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4580-0-0-p-637e605ed7","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 19 09:25:50.169152 kubelet[2926]: I0119 09:25:50.169149 2926 topology_manager.go:138] "Creating topology 
manager with none policy" Jan 19 09:25:50.169229 kubelet[2926]: I0119 09:25:50.169159 2926 container_manager_linux.go:306] "Creating device plugin manager" Jan 19 09:25:50.169229 kubelet[2926]: I0119 09:25:50.169183 2926 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Jan 19 09:25:50.170001 kubelet[2926]: I0119 09:25:50.169981 2926 state_mem.go:36] "Initialized new in-memory state store" Jan 19 09:25:50.170193 kubelet[2926]: I0119 09:25:50.170176 2926 kubelet.go:475] "Attempting to sync node with API server" Jan 19 09:25:50.170193 kubelet[2926]: I0119 09:25:50.170191 2926 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 19 09:25:50.170223 kubelet[2926]: I0119 09:25:50.170212 2926 kubelet.go:387] "Adding apiserver pod source" Jan 19 09:25:50.172445 kubelet[2926]: I0119 09:25:50.172425 2926 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 19 09:25:50.173872 kubelet[2926]: I0119 09:25:50.173849 2926 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 19 09:25:50.174478 kubelet[2926]: I0119 09:25:50.174462 2926 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 19 09:25:50.174532 kubelet[2926]: I0119 09:25:50.174502 2926 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Jan 19 09:25:50.179498 kubelet[2926]: I0119 09:25:50.178113 2926 server.go:1262] "Started kubelet" Jan 19 09:25:50.180939 kubelet[2926]: I0119 09:25:50.180918 2926 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 19 09:25:50.194765 kubelet[2926]: I0119 09:25:50.194458 2926 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 19 09:25:50.195865 kubelet[2926]: I0119 09:25:50.195469 2926 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 19 09:25:50.195865 kubelet[2926]: I0119 09:25:50.195510 2926 server_v1.go:49] "podresources" method="list" useActivePods=true Jan 19 09:25:50.195865 kubelet[2926]: I0119 09:25:50.195705 2926 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 19 09:25:50.201597 kubelet[2926]: I0119 09:25:50.201582 2926 server.go:310] "Adding debug handlers to kubelet server" Jan 19 09:25:50.202379 kubelet[2926]: I0119 09:25:50.202349 2926 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 19 09:25:50.203499 kubelet[2926]: I0119 09:25:50.203353 2926 volume_manager.go:313] "Starting Kubelet Volume Manager" Jan 19 09:25:50.205689 kubelet[2926]: I0119 09:25:50.205045 2926 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 19 09:25:50.207907 kubelet[2926]: I0119 09:25:50.207727 2926 reconciler.go:29] "Reconciler: start to sync state" Jan 19 09:25:50.210637 kubelet[2926]: I0119 09:25:50.210616 2926 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 19 09:25:50.212497 kubelet[2926]: E0119 09:25:50.212481 2926 kubelet.go:1615] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 19 09:25:50.214190 kubelet[2926]: I0119 09:25:50.214176 2926 factory.go:223] Registration of the containerd container factory successfully Jan 19 09:25:50.214259 kubelet[2926]: I0119 09:25:50.214253 2926 factory.go:223] Registration of the systemd container factory successfully Jan 19 09:25:50.221641 kubelet[2926]: I0119 09:25:50.221599 2926 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Jan 19 09:25:50.223476 kubelet[2926]: I0119 09:25:50.223142 2926 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Jan 19 09:25:50.223476 kubelet[2926]: I0119 09:25:50.223162 2926 status_manager.go:244] "Starting to sync pod status with apiserver" Jan 19 09:25:50.223476 kubelet[2926]: I0119 09:25:50.223207 2926 kubelet.go:2427] "Starting kubelet main sync loop" Jan 19 09:25:50.224299 kubelet[2926]: E0119 09:25:50.223255 2926 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 19 09:25:50.267510 kubelet[2926]: I0119 09:25:50.267481 2926 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 19 09:25:50.267510 kubelet[2926]: I0119 09:25:50.267494 2926 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 19 09:25:50.267510 kubelet[2926]: I0119 09:25:50.267511 2926 state_mem.go:36] "Initialized new in-memory state store" Jan 19 09:25:50.267850 kubelet[2926]: I0119 09:25:50.267835 2926 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 19 09:25:50.267879 kubelet[2926]: I0119 09:25:50.267846 2926 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 19 09:25:50.267879 kubelet[2926]: I0119 09:25:50.267860 2926 policy_none.go:49] "None policy: Start" Jan 19 09:25:50.267879 kubelet[2926]: I0119 09:25:50.267869 2926 memory_manager.go:187] "Starting memorymanager" policy="None" Jan 19 09:25:50.268218 kubelet[2926]: I0119 09:25:50.267879 2926 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Jan 19 09:25:50.268218 kubelet[2926]: I0119 09:25:50.267949 2926 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Jan 19 09:25:50.268218 kubelet[2926]: I0119 09:25:50.267954 2926 policy_none.go:47] "Start" Jan 19 09:25:50.272506 kubelet[2926]: E0119 09:25:50.272473 2926 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 19 09:25:50.272655 kubelet[2926]: I0119 09:25:50.272644 2926 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 19 09:25:50.272689 kubelet[2926]: I0119 09:25:50.272657 2926 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 19 09:25:50.272910 kubelet[2926]: I0119 09:25:50.272889 2926 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 19 09:25:50.274788 kubelet[2926]: E0119 09:25:50.274775 2926 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 19 09:25:50.326102 kubelet[2926]: I0119 09:25:50.326057 2926 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:50.326897 kubelet[2926]: I0119 09:25:50.326516 2926 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:50.326897 kubelet[2926]: I0119 09:25:50.326758 2926 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:50.336568 kubelet[2926]: E0119 09:25:50.336487 2926 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4580-0-0-p-637e605ed7\" already exists" pod="kube-system/kube-scheduler-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:50.384130 kubelet[2926]: I0119 09:25:50.382704 2926 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:50.392793 kubelet[2926]: I0119 09:25:50.392501 2926 kubelet_node_status.go:124] "Node was previously registered" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:50.392793 kubelet[2926]: I0119 09:25:50.392588 2926 kubelet_node_status.go:78] "Successfully registered node" node="ci-4580-0-0-p-637e605ed7" Jan 19 09:25:50.408667 kubelet[2926]: I0119 09:25:50.408635 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a39fa377be8cdc70cc7b513f2f125238-k8s-certs\") pod \"kube-apiserver-ci-4580-0-0-p-637e605ed7\" (UID: \"a39fa377be8cdc70cc7b513f2f125238\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:50.408970 kubelet[2926]: I0119 09:25:50.408829 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a39fa377be8cdc70cc7b513f2f125238-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4580-0-0-p-637e605ed7\" (UID: \"a39fa377be8cdc70cc7b513f2f125238\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:50.408970 kubelet[2926]: I0119 09:25:50.408855 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4b2d307ab72047ae819ff0d101b25605-ca-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-637e605ed7\" (UID: \"4b2d307ab72047ae819ff0d101b25605\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:50.408970 kubelet[2926]: I0119 09:25:50.408868 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4b2d307ab72047ae819ff0d101b25605-flexvolume-dir\") pod \"kube-controller-manager-ci-4580-0-0-p-637e605ed7\" (UID: \"4b2d307ab72047ae819ff0d101b25605\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:50.408970 kubelet[2926]: I0119 09:25:50.408881 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4b2d307ab72047ae819ff0d101b25605-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4580-0-0-p-637e605ed7\" (UID: \"4b2d307ab72047ae819ff0d101b25605\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:50.408970 kubelet[2926]: I0119 09:25:50.408894 2926 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4b2d307ab72047ae819ff0d101b25605-k8s-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-637e605ed7\" (UID: \"4b2d307ab72047ae819ff0d101b25605\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:50.409126 kubelet[2926]: I0119 09:25:50.408911 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4b2d307ab72047ae819ff0d101b25605-kubeconfig\") pod \"kube-controller-manager-ci-4580-0-0-p-637e605ed7\" (UID: \"4b2d307ab72047ae819ff0d101b25605\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:50.409126 kubelet[2926]: I0119 09:25:50.408928 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3d364a27a1cfd279606e69f4863203b3-kubeconfig\") pod \"kube-scheduler-ci-4580-0-0-p-637e605ed7\" (UID: \"3d364a27a1cfd279606e69f4863203b3\") " pod="kube-system/kube-scheduler-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:50.409126 kubelet[2926]: I0119 09:25:50.408946 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a39fa377be8cdc70cc7b513f2f125238-ca-certs\") pod \"kube-apiserver-ci-4580-0-0-p-637e605ed7\" (UID: \"a39fa377be8cdc70cc7b513f2f125238\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-637e605ed7" Jan 19 09:25:51.173444 kubelet[2926]: I0119 09:25:51.173103 2926 apiserver.go:52] "Watching apiserver" Jan 19 09:25:51.205340 kubelet[2926]: I0119 09:25:51.205288 2926 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 19 09:25:51.289331 kubelet[2926]: I0119 09:25:51.288867 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-637e605ed7" podStartSLOduration=1.288845329 podStartE2EDuration="1.288845329s" podCreationTimestamp="2026-01-19 09:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-19 09:25:51.278284064 +0000 UTC m=+1.175102835" watchObservedRunningTime="2026-01-19 09:25:51.288845329 +0000 UTC m=+1.185664090" Jan 19 09:25:51.304308 kubelet[2926]: I0119 09:25:51.303851 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4580-0-0-p-637e605ed7" podStartSLOduration=3.303818691 podStartE2EDuration="3.303818691s" podCreationTimestamp="2026-01-19 09:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-19 09:25:51.289927554 +0000 UTC m=+1.186746315" watchObservedRunningTime="2026-01-19 09:25:51.303818691 +0000 UTC m=+1.200637442" Jan 19 09:25:51.319914 kubelet[2926]: I0119 09:25:51.319222 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4580-0-0-p-637e605ed7" podStartSLOduration=1.319204335 podStartE2EDuration="1.319204335s" podCreationTimestamp="2026-01-19 09:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-19 09:25:51.30495155 +0000 UTC m=+1.201770311" 
watchObservedRunningTime="2026-01-19 09:25:51.319204335 +0000 UTC m=+1.216023096" Jan 19 09:25:56.650839 kubelet[2926]: I0119 09:25:56.650788 2926 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 19 09:25:56.651491 containerd[1676]: time="2026-01-19T09:25:56.651317945Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 19 09:25:56.651710 kubelet[2926]: I0119 09:25:56.651641 2926 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 19 09:25:57.430745 systemd[1]: Created slice kubepods-besteffort-pod17d50688_4e48_4242_a3e1_fcaed0c94231.slice - libcontainer container kubepods-besteffort-pod17d50688_4e48_4242_a3e1_fcaed0c94231.slice. Jan 19 09:25:57.455535 kubelet[2926]: I0119 09:25:57.455439 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htdc7\" (UniqueName: \"kubernetes.io/projected/17d50688-4e48-4242-a3e1-fcaed0c94231-kube-api-access-htdc7\") pod \"kube-proxy-rfg2s\" (UID: \"17d50688-4e48-4242-a3e1-fcaed0c94231\") " pod="kube-system/kube-proxy-rfg2s" Jan 19 09:25:57.455535 kubelet[2926]: I0119 09:25:57.455509 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/17d50688-4e48-4242-a3e1-fcaed0c94231-xtables-lock\") pod \"kube-proxy-rfg2s\" (UID: \"17d50688-4e48-4242-a3e1-fcaed0c94231\") " pod="kube-system/kube-proxy-rfg2s" Jan 19 09:25:57.455751 kubelet[2926]: I0119 09:25:57.455542 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/17d50688-4e48-4242-a3e1-fcaed0c94231-kube-proxy\") pod \"kube-proxy-rfg2s\" (UID: \"17d50688-4e48-4242-a3e1-fcaed0c94231\") " pod="kube-system/kube-proxy-rfg2s" Jan 19 09:25:57.455751 kubelet[2926]: I0119 09:25:57.455599 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/17d50688-4e48-4242-a3e1-fcaed0c94231-lib-modules\") pod \"kube-proxy-rfg2s\" (UID: \"17d50688-4e48-4242-a3e1-fcaed0c94231\") " pod="kube-system/kube-proxy-rfg2s" Jan 19 09:25:57.749185 containerd[1676]: time="2026-01-19T09:25:57.749028935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-rfg2s,Uid:17d50688-4e48-4242-a3e1-fcaed0c94231,Namespace:kube-system,Attempt:0,}" Jan 19 09:25:57.780738 containerd[1676]: time="2026-01-19T09:25:57.780660031Z" level=info msg="connecting to shim be27f565a9f112aa07555020aedc8044ed9373546d4e23b5b0b72627260791eb" address="unix:///run/containerd/s/6cbd86433d29be0ed9f6aadb1c4534831e7e32e7d109c2e5eec2e8694c7de763" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:25:57.849999 systemd[1]: Started cri-containerd-be27f565a9f112aa07555020aedc8044ed9373546d4e23b5b0b72627260791eb.scope - libcontainer container be27f565a9f112aa07555020aedc8044ed9373546d4e23b5b0b72627260791eb. Jan 19 09:25:57.892432 systemd[1]: Created slice kubepods-besteffort-pod7d1cbfd7_8a3f_4d2d_a0ff_0d5c26a16927.slice - libcontainer container kubepods-besteffort-pod7d1cbfd7_8a3f_4d2d_a0ff_0d5c26a16927.slice. 
Jan 19 09:25:57.901723 kernel: kauditd_printk_skb: 164 callbacks suppressed Jan 19 09:25:57.901798 kernel: audit: type=1334 audit(1768814757.897:448): prog-id=133 op=LOAD Jan 19 09:25:57.897000 audit: BPF prog-id=133 op=LOAD Jan 19 09:25:57.904636 kernel: audit: type=1334 audit(1768814757.897:449): prog-id=134 op=LOAD Jan 19 09:25:57.897000 audit: BPF prog-id=134 op=LOAD Jan 19 09:25:57.908447 kernel: audit: type=1300 audit(1768814757.897:449): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2982 pid=2994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:57.897000 audit[2994]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2982 pid=2994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:57.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265323766353635613966313132616130373535353032306165646338 Jan 19 09:25:57.919959 kernel: audit: type=1327 audit(1768814757.897:449): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265323766353635613966313132616130373535353032306165646338 Jan 19 09:25:57.897000 audit: BPF prog-id=134 op=UNLOAD Jan 19 09:25:57.929576 kernel: audit: type=1334 audit(1768814757.897:450): prog-id=134 op=UNLOAD Jan 19 09:25:57.929618 kernel: audit: type=1300 audit(1768814757.897:450): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2982 pid=2994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:57.897000 audit[2994]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2982 pid=2994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:57.935447 kernel: audit: type=1327 audit(1768814757.897:450): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265323766353635613966313132616130373535353032306165646338 Jan 19 09:25:57.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265323766353635613966313132616130373535353032306165646338 Jan 19 09:25:57.898000 audit: BPF prog-id=135 op=LOAD Jan 19 09:25:57.938138 kernel: audit: type=1334 audit(1768814757.898:451): prog-id=135 op=LOAD Jan 19 09:25:57.898000 audit[2994]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2982 pid=2994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:57.941791 kernel: audit: type=1300 audit(1768814757.898:451): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2982 pid=2994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:57.945463 containerd[1676]: time="2026-01-19T09:25:57.945306301Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-rfg2s,Uid:17d50688-4e48-4242-a3e1-fcaed0c94231,Namespace:kube-system,Attempt:0,} returns sandbox id \"be27f565a9f112aa07555020aedc8044ed9373546d4e23b5b0b72627260791eb\"" Jan 19 09:25:57.898000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265323766353635613966313132616130373535353032306165646338 Jan 19 09:25:57.949262 kernel: audit: type=1327 audit(1768814757.898:451): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265323766353635613966313132616130373535353032306165646338 Jan 19 09:25:57.898000 audit: BPF prog-id=136 op=LOAD Jan 19 09:25:57.898000 audit[2994]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2982 pid=2994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:57.898000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265323766353635613966313132616130373535353032306165646338 Jan 19 09:25:57.898000 audit: BPF prog-id=136 op=UNLOAD Jan 19 09:25:57.898000 audit[2994]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2982 pid=2994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:57.898000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265323766353635613966313132616130373535353032306165646338 Jan 19 09:25:57.898000 audit: BPF prog-id=135 op=UNLOAD Jan 19 09:25:57.898000 audit[2994]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2982 pid=2994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:57.898000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265323766353635613966313132616130373535353032306165646338 Jan 19 09:25:57.898000 audit: BPF prog-id=137 op=LOAD Jan 19 09:25:57.898000 audit[2994]: SYSCALL arch=c000003e 
syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2982 pid=2994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:57.898000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6265323766353635613966313132616130373535353032306165646338 Jan 19 09:25:57.957878 containerd[1676]: time="2026-01-19T09:25:57.957307954Z" level=info msg="CreateContainer within sandbox \"be27f565a9f112aa07555020aedc8044ed9373546d4e23b5b0b72627260791eb\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 19 09:25:57.958518 kubelet[2926]: I0119 09:25:57.958482 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrlzf\" (UniqueName: \"kubernetes.io/projected/7d1cbfd7-8a3f-4d2d-a0ff-0d5c26a16927-kube-api-access-zrlzf\") pod \"tigera-operator-65cdcdfd6d-b5zsr\" (UID: \"7d1cbfd7-8a3f-4d2d-a0ff-0d5c26a16927\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-b5zsr" Jan 19 09:25:57.958784 kubelet[2926]: I0119 09:25:57.958587 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7d1cbfd7-8a3f-4d2d-a0ff-0d5c26a16927-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-b5zsr\" (UID: \"7d1cbfd7-8a3f-4d2d-a0ff-0d5c26a16927\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-b5zsr" Jan 19 09:25:57.970085 containerd[1676]: time="2026-01-19T09:25:57.970051905Z" level=info msg="Container 561736982efaa7783634602826bd58d1c2235565125a48ba4b7829e2b32bfd81: CDI devices from CRI Config.CDIDevices: []" Jan 19 09:25:57.977182 containerd[1676]: time="2026-01-19T09:25:57.977149285Z" level=info msg="CreateContainer within sandbox \"be27f565a9f112aa07555020aedc8044ed9373546d4e23b5b0b72627260791eb\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"561736982efaa7783634602826bd58d1c2235565125a48ba4b7829e2b32bfd81\"" Jan 19 09:25:57.978660 containerd[1676]: time="2026-01-19T09:25:57.978630417Z" level=info msg="StartContainer for \"561736982efaa7783634602826bd58d1c2235565125a48ba4b7829e2b32bfd81\"" Jan 19 09:25:57.980999 containerd[1676]: time="2026-01-19T09:25:57.980932638Z" level=info msg="connecting to shim 561736982efaa7783634602826bd58d1c2235565125a48ba4b7829e2b32bfd81" address="unix:///run/containerd/s/6cbd86433d29be0ed9f6aadb1c4534831e7e32e7d109c2e5eec2e8694c7de763" protocol=ttrpc version=3 Jan 19 09:25:58.017635 systemd[1]: Started cri-containerd-561736982efaa7783634602826bd58d1c2235565125a48ba4b7829e2b32bfd81.scope - libcontainer container 561736982efaa7783634602826bd58d1c2235565125a48ba4b7829e2b32bfd81. 
Jan 19 09:25:58.099000 audit: BPF prog-id=138 op=LOAD Jan 19 09:25:58.099000 audit[3019]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=2982 pid=3019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.099000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536313733363938326566616137373833363334363032383236626435 Jan 19 09:25:58.099000 audit: BPF prog-id=139 op=LOAD Jan 19 09:25:58.099000 audit[3019]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=2982 pid=3019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.099000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536313733363938326566616137373833363334363032383236626435 Jan 19 09:25:58.099000 audit: BPF prog-id=139 op=UNLOAD Jan 19 09:25:58.099000 audit[3019]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2982 pid=3019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.099000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536313733363938326566616137373833363334363032383236626435 Jan 19 09:25:58.100000 audit: BPF prog-id=138 op=UNLOAD Jan 19 09:25:58.100000 audit[3019]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2982 pid=3019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.100000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536313733363938326566616137373833363334363032383236626435 Jan 19 09:25:58.100000 audit: BPF prog-id=140 op=LOAD Jan 19 09:25:58.100000 audit[3019]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=2982 pid=3019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.100000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536313733363938326566616137373833363334363032383236626435 Jan 19 09:25:58.142705 containerd[1676]: time="2026-01-19T09:25:58.142606670Z" level=info msg="StartContainer for 
\"561736982efaa7783634602826bd58d1c2235565125a48ba4b7829e2b32bfd81\" returns successfully" Jan 19 09:25:58.202316 containerd[1676]: time="2026-01-19T09:25:58.201994167Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-b5zsr,Uid:7d1cbfd7-8a3f-4d2d-a0ff-0d5c26a16927,Namespace:tigera-operator,Attempt:0,}" Jan 19 09:25:58.241849 containerd[1676]: time="2026-01-19T09:25:58.241723303Z" level=info msg="connecting to shim e9866da1a39955c0d1deca5238720e49e3764c91c409ebc27d9b2bd7d3c644bf" address="unix:///run/containerd/s/c663bd7e7f7ace4214b4e6ba1b09322776babf8109134ce90184dfd1ca37d5d0" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:25:58.268549 systemd[1]: Started cri-containerd-e9866da1a39955c0d1deca5238720e49e3764c91c409ebc27d9b2bd7d3c644bf.scope - libcontainer container e9866da1a39955c0d1deca5238720e49e3764c91c409ebc27d9b2bd7d3c644bf. Jan 19 09:25:58.296000 audit: BPF prog-id=141 op=LOAD Jan 19 09:25:58.296000 audit: BPF prog-id=142 op=LOAD Jan 19 09:25:58.296000 audit[3075]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3063 pid=3075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.296000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539383636646131613339393535633064316465636135323338373230 Jan 19 09:25:58.296000 audit: BPF prog-id=142 op=UNLOAD Jan 19 09:25:58.296000 audit[3075]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3063 pid=3075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.296000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539383636646131613339393535633064316465636135323338373230 Jan 19 09:25:58.296000 audit: BPF prog-id=143 op=LOAD Jan 19 09:25:58.296000 audit[3075]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3063 pid=3075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.296000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539383636646131613339393535633064316465636135323338373230 Jan 19 09:25:58.296000 audit: BPF prog-id=144 op=LOAD Jan 19 09:25:58.296000 audit[3075]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3063 pid=3075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.296000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539383636646131613339393535633064316465636135323338373230 Jan 19 09:25:58.296000 audit: BPF prog-id=144 op=UNLOAD Jan 19 09:25:58.296000 audit[3075]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3063 pid=3075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.296000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539383636646131613339393535633064316465636135323338373230 Jan 19 09:25:58.296000 audit: BPF prog-id=143 op=UNLOAD Jan 19 09:25:58.296000 audit[3075]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3063 pid=3075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.296000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539383636646131613339393535633064316465636135323338373230 Jan 19 09:25:58.296000 audit: BPF prog-id=145 op=LOAD Jan 19 09:25:58.296000 audit[3075]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3063 pid=3075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.296000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539383636646131613339393535633064316465636135323338373230 Jan 19 09:25:58.336103 containerd[1676]: time="2026-01-19T09:25:58.335941916Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-b5zsr,Uid:7d1cbfd7-8a3f-4d2d-a0ff-0d5c26a16927,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e9866da1a39955c0d1deca5238720e49e3764c91c409ebc27d9b2bd7d3c644bf\"" Jan 19 09:25:58.337485 containerd[1676]: time="2026-01-19T09:25:58.337441995Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 19 09:25:58.474000 audit[3131]: NETFILTER_CFG table=mangle:54 family=10 entries=1 op=nft_register_chain pid=3131 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.475000 audit[3130]: NETFILTER_CFG table=mangle:55 family=2 entries=1 op=nft_register_chain pid=3130 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.474000 audit[3131]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff1881b6e0 a2=0 a3=7fff1881b6cc items=0 ppid=3032 pid=3131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.474000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 19 09:25:58.475000 audit[3130]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffefd6c0b10 a2=0 a3=8ab541c964de8acd items=0 ppid=3032 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.475000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 19 09:25:58.481000 audit[3135]: NETFILTER_CFG table=nat:56 family=10 entries=1 op=nft_register_chain pid=3135 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.481000 audit[3135]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc8aa098b0 a2=0 a3=7ffc8aa0989c items=0 ppid=3032 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.481000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 19 09:25:58.484000 audit[3136]: NETFILTER_CFG table=filter:57 family=10 entries=1 op=nft_register_chain pid=3136 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.484000 audit[3136]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe951bdd30 a2=0 a3=7ffe951bdd1c items=0 ppid=3032 pid=3136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.484000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 19 09:25:58.493000 audit[3138]: NETFILTER_CFG table=nat:58 family=2 entries=1 op=nft_register_chain pid=3138 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.493000 audit[3138]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc72d39770 a2=0 a3=7ffc72d3975c items=0 ppid=3032 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.493000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 19 09:25:58.499000 audit[3139]: NETFILTER_CFG table=filter:59 family=2 entries=1 op=nft_register_chain pid=3139 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.499000 audit[3139]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc1a097ff0 a2=0 a3=7ffc1a097fdc items=0 ppid=3032 pid=3139 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.499000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 19 09:25:58.583000 audit[3140]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3140 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.583000 audit[3140]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffde9332370 a2=0 a3=7ffde933235c items=0 ppid=3032 pid=3140 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.583000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 19 09:25:58.590000 audit[3142]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3142 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.590000 audit[3142]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffc982cad50 a2=0 a3=7ffc982cad3c items=0 ppid=3032 pid=3142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.590000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73002D Jan 19 09:25:58.600000 audit[3145]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3145 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.600000 audit[3145]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffcf171c8e0 a2=0 a3=7ffcf171c8cc items=0 ppid=3032 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.600000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Jan 19 09:25:58.604000 audit[3146]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3146 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.604000 audit[3146]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff11db4030 a2=0 a3=7fff11db401c items=0 ppid=3032 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.604000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 19 09:25:58.611000 audit[3148]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3148 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.611000 audit[3148]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffecfdcad20 a2=0 a3=7ffecfdcad0c items=0 ppid=3032 pid=3148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.611000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 19 09:25:58.614000 audit[3149]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3149 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.614000 audit[3149]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc98f45a50 a2=0 a3=7ffc98f45a3c items=0 ppid=3032 pid=3149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.614000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Jan 19 09:25:58.620000 audit[3151]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3151 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.620000 audit[3151]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffd0bb68080 a2=0 a3=7ffd0bb6806c items=0 ppid=3032 pid=3151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.620000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 19 09:25:58.630000 audit[3154]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3154 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.630000 audit[3154]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fff035d8670 a2=0 a3=7fff035d865c items=0 ppid=3032 pid=3154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.630000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 19 09:25:58.632000 audit[3155]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3155 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.632000 audit[3155]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff30592cd0 a2=0 a3=7fff30592cbc items=0 ppid=3032 pid=3155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.632000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Jan 19 09:25:58.639000 audit[3157]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3157 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.639000 audit[3157]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffcaca882d0 a2=0 a3=7ffcaca882bc items=0 ppid=3032 pid=3157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.639000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 19 09:25:58.642000 audit[3158]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3158 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.642000 audit[3158]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffeb2994a10 a2=0 a3=7ffeb29949fc items=0 ppid=3032 pid=3158 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.642000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 19 09:25:58.649000 audit[3160]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3160 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.649000 audit[3160]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd7100e0e0 a2=0 a3=7ffd7100e0cc items=0 ppid=3032 pid=3160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.649000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F5859 Jan 19 09:25:58.662000 audit[3163]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3163 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.662000 audit[3163]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff39100110 a2=0 a3=7fff391000fc items=0 ppid=3032 pid=3163 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.662000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Jan 19 09:25:58.672000 audit[3166]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3166 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.672000 audit[3166]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffddf3e86c0 a2=0 a3=7ffddf3e86ac items=0 ppid=3032 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.672000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Jan 19 09:25:58.676000 audit[3167]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3167 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.676000 audit[3167]: 
SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffed19a7530 a2=0 a3=7ffed19a751c items=0 ppid=3032 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.676000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Jan 19 09:25:58.683000 audit[3169]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3169 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.683000 audit[3169]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffc4b145000 a2=0 a3=7ffc4b144fec items=0 ppid=3032 pid=3169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.683000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 19 09:25:58.693000 audit[3172]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3172 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.693000 audit[3172]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe6f7f9cb0 a2=0 a3=7ffe6f7f9c9c items=0 ppid=3032 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.693000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 19 09:25:58.696000 audit[3173]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3173 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.696000 audit[3173]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffec8d43d70 a2=0 a3=7ffec8d43d5c items=0 ppid=3032 pid=3173 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.696000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 19 09:25:58.703000 audit[3175]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3175 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 19 09:25:58.703000 audit[3175]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffcd7ce4000 a2=0 a3=7ffcd7ce3fec items=0 ppid=3032 pid=3175 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.703000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 19 09:25:58.748000 audit[3181]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3181 subj=system_u:system_r:kernel_t:s0 
comm="iptables-restor" Jan 19 09:25:58.748000 audit[3181]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffff3572310 a2=0 a3=7ffff35722fc items=0 ppid=3032 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.748000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:25:58.757000 audit[3181]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3181 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:25:58.757000 audit[3181]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7ffff3572310 a2=0 a3=7ffff35722fc items=0 ppid=3032 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.757000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:25:58.760000 audit[3186]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3186 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.760000 audit[3186]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffd5b91e390 a2=0 a3=7ffd5b91e37c items=0 ppid=3032 pid=3186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.760000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 19 09:25:58.764000 audit[3188]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3188 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.764000 audit[3188]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7fff8541d090 a2=0 a3=7fff8541d07c items=0 ppid=3032 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.764000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Jan 19 09:25:58.771000 audit[3191]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3191 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.771000 audit[3191]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7fff955c1cf0 a2=0 a3=7fff955c1cdc items=0 ppid=3032 pid=3191 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.771000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C Jan 19 09:25:58.774000 audit[3192]: 
NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3192 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.774000 audit[3192]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffea453ae80 a2=0 a3=7ffea453ae6c items=0 ppid=3032 pid=3192 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.774000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 19 09:25:58.778000 audit[3194]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3194 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.778000 audit[3194]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd3c6faa60 a2=0 a3=7ffd3c6faa4c items=0 ppid=3032 pid=3194 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.778000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 19 09:25:58.780000 audit[3195]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3195 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.780000 audit[3195]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff93aa0620 a2=0 a3=7fff93aa060c items=0 ppid=3032 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.780000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Jan 19 09:25:58.783000 audit[3197]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3197 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.783000 audit[3197]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffdd3cd0610 a2=0 a3=7ffdd3cd05fc items=0 ppid=3032 pid=3197 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.783000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 19 09:25:58.787000 audit[3200]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3200 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.787000 audit[3200]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffcfdc75f70 a2=0 a3=7ffcfdc75f5c items=0 ppid=3032 pid=3200 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.787000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 19 09:25:58.789000 audit[3201]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3201 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.789000 audit[3201]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd09cc5490 a2=0 a3=7ffd09cc547c items=0 ppid=3032 pid=3201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.789000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Jan 19 09:25:58.792000 audit[3203]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3203 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.792000 audit[3203]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc1496b4b0 a2=0 a3=7ffc1496b49c items=0 ppid=3032 pid=3203 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.792000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 19 09:25:58.793000 audit[3204]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3204 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.793000 audit[3204]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc6597e8b0 a2=0 a3=7ffc6597e89c items=0 ppid=3032 pid=3204 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.793000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 19 09:25:58.797000 audit[3206]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3206 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.797000 audit[3206]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe710c3410 a2=0 a3=7ffe710c33fc items=0 ppid=3032 pid=3206 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.797000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Jan 19 09:25:58.801000 audit[3209]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3209 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.801000 audit[3209]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff04d3e0a0 a2=0 a3=7fff04d3e08c items=0 ppid=3032 pid=3209 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.801000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Jan 19 09:25:58.805000 audit[3212]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3212 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.805000 audit[3212]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffcbb641320 a2=0 a3=7ffcbb64130c items=0 ppid=3032 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.805000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D5052 Jan 19 09:25:58.807000 audit[3213]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3213 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.807000 audit[3213]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffefccbc560 a2=0 a3=7ffefccbc54c items=0 ppid=3032 pid=3213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.807000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Jan 19 09:25:58.810000 audit[3215]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3215 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.810000 audit[3215]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffd56793a90 a2=0 a3=7ffd56793a7c items=0 ppid=3032 pid=3215 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.810000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 19 09:25:58.814000 audit[3218]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3218 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.814000 audit[3218]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc5c8268e0 a2=0 a3=7ffc5c8268cc items=0 ppid=3032 pid=3218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.814000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 19 09:25:58.815000 audit[3219]: NETFILTER_CFG table=nat:98 family=10 
entries=1 op=nft_register_chain pid=3219 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.815000 audit[3219]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe97e5f910 a2=0 a3=7ffe97e5f8fc items=0 ppid=3032 pid=3219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.815000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 19 09:25:58.820000 audit[3221]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3221 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.820000 audit[3221]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffe4c239ea0 a2=0 a3=7ffe4c239e8c items=0 ppid=3032 pid=3221 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.820000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 19 09:25:58.823000 audit[3222]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3222 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.823000 audit[3222]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff54a968d0 a2=0 a3=7fff54a968bc items=0 ppid=3032 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.823000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 19 09:25:58.828000 audit[3224]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3224 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.828000 audit[3224]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffd02ff0b00 a2=0 a3=7ffd02ff0aec items=0 ppid=3032 pid=3224 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.828000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 19 09:25:58.832000 audit[3227]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3227 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 19 09:25:58.832000 audit[3227]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7fffe558c3f0 a2=0 a3=7fffe558c3dc items=0 ppid=3032 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.832000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 19 09:25:58.837000 audit[3229]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3229 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 19 
09:25:58.837000 audit[3229]: SYSCALL arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7ffd77e8c760 a2=0 a3=7ffd77e8c74c items=0 ppid=3032 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.837000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:25:58.838000 audit[3229]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3229 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 19 09:25:58.838000 audit[3229]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7ffd77e8c760 a2=0 a3=7ffd77e8c74c items=0 ppid=3032 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:25:58.838000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:00.398796 kubelet[2926]: I0119 09:26:00.398718 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-rfg2s" podStartSLOduration=3.39869389 podStartE2EDuration="3.39869389s" podCreationTimestamp="2026-01-19 09:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-19 09:25:58.279575818 +0000 UTC m=+8.176394569" watchObservedRunningTime="2026-01-19 09:26:00.39869389 +0000 UTC m=+10.295512681" Jan 19 09:26:02.142255 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount881582319.mount: Deactivated successfully. 
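A note on reading the audit records above: the proctitle= field is the invoking process's full command line, hex-encoded with NUL bytes separating the arguments. A minimal decoding sketch in Python (the helper name is mine, not part of any tool appearing in this log):

    def decode_proctitle(hex_string: str) -> str:
        # The kernel audit subsystem records argv as hex with NUL separators.
        raw = bytes.fromhex(hex_string)
        return " ".join(arg.decode("utf-8", errors="replace") for arg in raw.split(b"\x00"))

    # One of the proctitle values captured above decodes to the kube-proxy
    # canary-chain command:
    sample = ("69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259"
              "002D74006D616E676C65")
    print(decode_proctitle(sample))  # iptables -w 5 -N KUBE-PROXY-CANARY -t mangle

Applied to the runc entries, the same decoding yields the "runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/..." invocations that accompany each BPF prog-id LOAD/UNLOAD pair (the hex in those records is truncated, so the decoded path is truncated too).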
Jan 19 09:26:02.564575 containerd[1676]: time="2026-01-19T09:26:02.564521283Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:26:02.565905 containerd[1676]: time="2026-01-19T09:26:02.565814526Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=0" Jan 19 09:26:02.566783 containerd[1676]: time="2026-01-19T09:26:02.566760351Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:26:02.568975 containerd[1676]: time="2026-01-19T09:26:02.568538127Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:26:02.568975 containerd[1676]: time="2026-01-19T09:26:02.568883136Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 4.23140032s" Jan 19 09:26:02.568975 containerd[1676]: time="2026-01-19T09:26:02.568903156Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Jan 19 09:26:02.572943 containerd[1676]: time="2026-01-19T09:26:02.572904689Z" level=info msg="CreateContainer within sandbox \"e9866da1a39955c0d1deca5238720e49e3764c91c409ebc27d9b2bd7d3c644bf\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 19 09:26:02.582432 containerd[1676]: time="2026-01-19T09:26:02.580438664Z" level=info msg="Container 8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22: CDI devices from CRI Config.CDIDevices: []" Jan 19 09:26:02.593683 containerd[1676]: time="2026-01-19T09:26:02.593635904Z" level=info msg="CreateContainer within sandbox \"e9866da1a39955c0d1deca5238720e49e3764c91c409ebc27d9b2bd7d3c644bf\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22\"" Jan 19 09:26:02.594188 containerd[1676]: time="2026-01-19T09:26:02.594174208Z" level=info msg="StartContainer for \"8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22\"" Jan 19 09:26:02.595225 containerd[1676]: time="2026-01-19T09:26:02.595200924Z" level=info msg="connecting to shim 8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22" address="unix:///run/containerd/s/c663bd7e7f7ace4214b4e6ba1b09322776babf8109134ce90184dfd1ca37d5d0" protocol=ttrpc version=3 Jan 19 09:26:02.618584 systemd[1]: Started cri-containerd-8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22.scope - libcontainer container 8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22. 
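The two containerd timestamps above (PullImage at 09:25:58.337441995Z, Pulled at 09:26:02.568883136Z) bracket the reported 4.23140032s pull of quay.io/tigera/operator:v1.38.7. A quick cross-check sketch (parse_ts is an ad-hoc helper; Python's datetime keeps only microseconds, so the nanosecond tail is truncated before parsing):

    from datetime import datetime

    def parse_ts(ts: str) -> datetime:
        # Drop the trailing Z and cut the fractional part to microseconds.
        base, frac = ts.rstrip("Z").split(".")
        return datetime.strptime(f"{base}.{frac[:6]}", "%Y-%m-%dT%H:%M:%S.%f")

    start = parse_ts("2026-01-19T09:25:58.337441995Z")
    done = parse_ts("2026-01-19T09:26:02.568883136Z")
    print((done - start).total_seconds())  # ~4.2314, consistent with the logged 4.23140032s

    # Treating the reported size "25057686" as bytes transferred (an assumption),
    # that works out to roughly 25057686 / 4.2314 = ~5.9 MB/s average pull rate.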
Jan 19 09:26:02.627000 audit: BPF prog-id=146 op=LOAD Jan 19 09:26:02.627000 audit: BPF prog-id=147 op=LOAD Jan 19 09:26:02.627000 audit[3239]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=3063 pid=3239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:02.627000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3861336166363232613535363664393434346632643863393030386438 Jan 19 09:26:02.627000 audit: BPF prog-id=147 op=UNLOAD Jan 19 09:26:02.627000 audit[3239]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3063 pid=3239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:02.627000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3861336166363232613535363664393434346632643863393030386438 Jan 19 09:26:02.627000 audit: BPF prog-id=148 op=LOAD Jan 19 09:26:02.627000 audit[3239]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3063 pid=3239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:02.627000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3861336166363232613535363664393434346632643863393030386438 Jan 19 09:26:02.628000 audit: BPF prog-id=149 op=LOAD Jan 19 09:26:02.628000 audit[3239]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3063 pid=3239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:02.628000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3861336166363232613535363664393434346632643863393030386438 Jan 19 09:26:02.628000 audit: BPF prog-id=149 op=UNLOAD Jan 19 09:26:02.628000 audit[3239]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3063 pid=3239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:02.628000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3861336166363232613535363664393434346632643863393030386438 Jan 19 09:26:02.628000 audit: BPF prog-id=148 op=UNLOAD Jan 19 09:26:02.628000 audit[3239]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3063 pid=3239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:02.628000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3861336166363232613535363664393434346632643863393030386438 Jan 19 09:26:02.628000 audit: BPF prog-id=150 op=LOAD Jan 19 09:26:02.628000 audit[3239]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3063 pid=3239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:02.628000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3861336166363232613535363664393434346632643863393030386438 Jan 19 09:26:02.644524 containerd[1676]: time="2026-01-19T09:26:02.644386723Z" level=info msg="StartContainer for \"8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22\" returns successfully" Jan 19 09:26:04.196914 kubelet[2926]: I0119 09:26:04.196739 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-b5zsr" podStartSLOduration=2.964148067 podStartE2EDuration="7.19672032s" podCreationTimestamp="2026-01-19 09:25:57 +0000 UTC" firstStartedPulling="2026-01-19 09:25:58.337039972 +0000 UTC m=+8.233858723" lastFinishedPulling="2026-01-19 09:26:02.569612215 +0000 UTC m=+12.466430976" observedRunningTime="2026-01-19 09:26:03.291637699 +0000 UTC m=+13.188456490" watchObservedRunningTime="2026-01-19 09:26:04.19672032 +0000 UTC m=+14.093539081" Jan 19 09:26:08.339968 sudo[1960]: pam_unix(sudo:session): session closed for user root Jan 19 09:26:08.348435 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 19 09:26:08.348576 kernel: audit: type=1106 audit(1768814768.339:528): pid=1960 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 19 09:26:08.339000 audit[1960]: USER_END pid=1960 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 19 09:26:08.358009 kernel: audit: type=1104 audit(1768814768.339:529): pid=1960 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 19 09:26:08.339000 audit[1960]: CRED_DISP pid=1960 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 19 09:26:08.460800 sshd[1959]: Connection closed by 4.153.228.146 port 57776 Jan 19 09:26:08.462807 sshd-session[1955]: pam_unix(sshd:session): session closed for user core Jan 19 09:26:08.462000 audit[1955]: USER_END pid=1955 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:26:08.472419 kernel: audit: type=1106 audit(1768814768.462:530): pid=1955 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:26:08.476063 systemd[1]: sshd@7-77.42.19.180:22-4.153.228.146:57776.service: Deactivated successfully. Jan 19 09:26:08.470000 audit[1955]: CRED_DISP pid=1955 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:26:08.481220 systemd[1]: session-8.scope: Deactivated successfully. Jan 19 09:26:08.481883 systemd[1]: session-8.scope: Consumed 4.991s CPU time, 234.1M memory peak. Jan 19 09:26:08.482407 kernel: audit: type=1104 audit(1768814768.470:531): pid=1955 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:26:08.477000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-77.42.19.180:22-4.153.228.146:57776 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:26:08.484632 systemd-logind[1647]: Session 8 logged out. Waiting for processes to exit. Jan 19 09:26:08.488160 systemd-logind[1647]: Removed session 8. Jan 19 09:26:08.490421 kernel: audit: type=1131 audit(1768814768.477:532): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-77.42.19.180:22-4.153.228.146:57776 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 19 09:26:09.067000 audit[3321]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3321 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:09.073471 kernel: audit: type=1325 audit(1768814769.067:533): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3321 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:09.067000 audit[3321]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffd63e33a70 a2=0 a3=7ffd63e33a5c items=0 ppid=3032 pid=3321 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:09.083425 kernel: audit: type=1300 audit(1768814769.067:533): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffd63e33a70 a2=0 a3=7ffd63e33a5c items=0 ppid=3032 pid=3321 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:09.067000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:09.093435 kernel: audit: type=1327 audit(1768814769.067:533): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:09.093498 kernel: audit: type=1325 audit(1768814769.073:534): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3321 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:09.073000 audit[3321]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3321 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:09.073000 audit[3321]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd63e33a70 a2=0 a3=0 items=0 ppid=3032 pid=3321 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:09.108429 kernel: audit: type=1300 audit(1768814769.073:534): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd63e33a70 a2=0 a3=0 items=0 ppid=3032 pid=3321 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:09.073000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:10.096000 audit[3323]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:10.096000 audit[3323]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffc87c8b590 a2=0 a3=7ffc87c8b57c items=0 ppid=3032 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:10.096000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:10.100000 audit[3323]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:10.100000 
audit[3323]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc87c8b590 a2=0 a3=0 items=0 ppid=3032 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:10.100000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:11.119000 audit[3327]: NETFILTER_CFG table=filter:109 family=2 entries=18 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:11.119000 audit[3327]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffdf65c04f0 a2=0 a3=7ffdf65c04dc items=0 ppid=3032 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:11.119000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:11.126000 audit[3327]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:11.126000 audit[3327]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffdf65c04f0 a2=0 a3=0 items=0 ppid=3032 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:11.126000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:12.384000 audit[3329]: NETFILTER_CFG table=filter:111 family=2 entries=21 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:12.384000 audit[3329]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffeaea73e40 a2=0 a3=7ffeaea73e2c items=0 ppid=3032 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.384000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:12.387000 audit[3329]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:12.387000 audit[3329]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffeaea73e40 a2=0 a3=0 items=0 ppid=3032 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.387000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:12.438858 systemd[1]: Created slice kubepods-besteffort-pod34707322_eca1_4e91_b4cf_5a3d076b44f5.slice - libcontainer container kubepods-besteffort-pod34707322_eca1_4e91_b4cf_5a3d076b44f5.slice. 
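The recurring NETFILTER_CFG bursts above come from kube-proxy invoking iptables-restore; each record carries the table, the address family (2 = IPv4, 10 = IPv6), the operation, and how many entries were registered. A small sketch that tallies those registrations from a saved copy of this console log (the file name is hypothetical):

    import re
    from collections import Counter

    pattern = re.compile(
        r"NETFILTER_CFG table=(?P<table>\w+):\d+ family=(?P<family>\d+) "
        r"entries=(?P<entries>\d+) op=(?P<op>\w+)"
    )

    counts = Counter()
    with open("node-console.log") as log:  # hypothetical path to this capture
        for line in log:
            # A single physical line in this capture can hold several audit
            # records, so scan each line for every match, not just the first.
            for m in pattern.finditer(line):
                family = "ipv4" if m["family"] == "2" else "ipv6"
                counts[(m["table"], family, m["op"])] += int(m["entries"])

    for (table, family, op), total in sorted(counts.items()):
        print(f"{table:8} {family:4} {op:22} {total}")

This gives a rough picture of how many filter and nat chains and rules the proxy and CNI components have pushed by any given point in the boot.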
Jan 19 09:26:12.447360 kubelet[2926]: I0119 09:26:12.447302 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34707322-eca1-4e91-b4cf-5a3d076b44f5-tigera-ca-bundle\") pod \"calico-typha-8666647ccf-bkmvf\" (UID: \"34707322-eca1-4e91-b4cf-5a3d076b44f5\") " pod="calico-system/calico-typha-8666647ccf-bkmvf" Jan 19 09:26:12.447360 kubelet[2926]: I0119 09:26:12.447328 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/34707322-eca1-4e91-b4cf-5a3d076b44f5-typha-certs\") pod \"calico-typha-8666647ccf-bkmvf\" (UID: \"34707322-eca1-4e91-b4cf-5a3d076b44f5\") " pod="calico-system/calico-typha-8666647ccf-bkmvf" Jan 19 09:26:12.447360 kubelet[2926]: I0119 09:26:12.447340 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bvhk\" (UniqueName: \"kubernetes.io/projected/34707322-eca1-4e91-b4cf-5a3d076b44f5-kube-api-access-4bvhk\") pod \"calico-typha-8666647ccf-bkmvf\" (UID: \"34707322-eca1-4e91-b4cf-5a3d076b44f5\") " pod="calico-system/calico-typha-8666647ccf-bkmvf" Jan 19 09:26:12.621257 systemd[1]: Created slice kubepods-besteffort-pod95c50da7_f0be_44f4_9371_f1b8d2b1156d.slice - libcontainer container kubepods-besteffort-pod95c50da7_f0be_44f4_9371_f1b8d2b1156d.slice. Jan 19 09:26:12.649150 kubelet[2926]: I0119 09:26:12.648447 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95c50da7-f0be-44f4-9371-f1b8d2b1156d-lib-modules\") pod \"calico-node-2vxc2\" (UID: \"95c50da7-f0be-44f4-9371-f1b8d2b1156d\") " pod="calico-system/calico-node-2vxc2" Jan 19 09:26:12.649150 kubelet[2926]: I0119 09:26:12.648514 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/95c50da7-f0be-44f4-9371-f1b8d2b1156d-policysync\") pod \"calico-node-2vxc2\" (UID: \"95c50da7-f0be-44f4-9371-f1b8d2b1156d\") " pod="calico-system/calico-node-2vxc2" Jan 19 09:26:12.649150 kubelet[2926]: I0119 09:26:12.648527 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95c50da7-f0be-44f4-9371-f1b8d2b1156d-tigera-ca-bundle\") pod \"calico-node-2vxc2\" (UID: \"95c50da7-f0be-44f4-9371-f1b8d2b1156d\") " pod="calico-system/calico-node-2vxc2" Jan 19 09:26:12.649150 kubelet[2926]: I0119 09:26:12.648537 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/95c50da7-f0be-44f4-9371-f1b8d2b1156d-xtables-lock\") pod \"calico-node-2vxc2\" (UID: \"95c50da7-f0be-44f4-9371-f1b8d2b1156d\") " pod="calico-system/calico-node-2vxc2" Jan 19 09:26:12.649150 kubelet[2926]: I0119 09:26:12.648550 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/95c50da7-f0be-44f4-9371-f1b8d2b1156d-var-lib-calico\") pod \"calico-node-2vxc2\" (UID: \"95c50da7-f0be-44f4-9371-f1b8d2b1156d\") " pod="calico-system/calico-node-2vxc2" Jan 19 09:26:12.649522 kubelet[2926]: I0119 09:26:12.648587 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" 
(UniqueName: \"kubernetes.io/host-path/95c50da7-f0be-44f4-9371-f1b8d2b1156d-cni-log-dir\") pod \"calico-node-2vxc2\" (UID: \"95c50da7-f0be-44f4-9371-f1b8d2b1156d\") " pod="calico-system/calico-node-2vxc2" Jan 19 09:26:12.649522 kubelet[2926]: I0119 09:26:12.648606 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82mvf\" (UniqueName: \"kubernetes.io/projected/95c50da7-f0be-44f4-9371-f1b8d2b1156d-kube-api-access-82mvf\") pod \"calico-node-2vxc2\" (UID: \"95c50da7-f0be-44f4-9371-f1b8d2b1156d\") " pod="calico-system/calico-node-2vxc2" Jan 19 09:26:12.649522 kubelet[2926]: I0119 09:26:12.648618 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/95c50da7-f0be-44f4-9371-f1b8d2b1156d-var-run-calico\") pod \"calico-node-2vxc2\" (UID: \"95c50da7-f0be-44f4-9371-f1b8d2b1156d\") " pod="calico-system/calico-node-2vxc2" Jan 19 09:26:12.649522 kubelet[2926]: I0119 09:26:12.648662 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/95c50da7-f0be-44f4-9371-f1b8d2b1156d-cni-net-dir\") pod \"calico-node-2vxc2\" (UID: \"95c50da7-f0be-44f4-9371-f1b8d2b1156d\") " pod="calico-system/calico-node-2vxc2" Jan 19 09:26:12.649522 kubelet[2926]: I0119 09:26:12.648671 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/95c50da7-f0be-44f4-9371-f1b8d2b1156d-flexvol-driver-host\") pod \"calico-node-2vxc2\" (UID: \"95c50da7-f0be-44f4-9371-f1b8d2b1156d\") " pod="calico-system/calico-node-2vxc2" Jan 19 09:26:12.649767 kubelet[2926]: I0119 09:26:12.648680 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/95c50da7-f0be-44f4-9371-f1b8d2b1156d-node-certs\") pod \"calico-node-2vxc2\" (UID: \"95c50da7-f0be-44f4-9371-f1b8d2b1156d\") " pod="calico-system/calico-node-2vxc2" Jan 19 09:26:12.649767 kubelet[2926]: I0119 09:26:12.648691 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/95c50da7-f0be-44f4-9371-f1b8d2b1156d-cni-bin-dir\") pod \"calico-node-2vxc2\" (UID: \"95c50da7-f0be-44f4-9371-f1b8d2b1156d\") " pod="calico-system/calico-node-2vxc2" Jan 19 09:26:12.747232 containerd[1676]: time="2026-01-19T09:26:12.746959674Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8666647ccf-bkmvf,Uid:34707322-eca1-4e91-b4cf-5a3d076b44f5,Namespace:calico-system,Attempt:0,}" Jan 19 09:26:12.754584 kubelet[2926]: E0119 09:26:12.754471 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.754584 kubelet[2926]: W0119 09:26:12.754507 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.754584 kubelet[2926]: E0119 09:26:12.754525 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:12.759803 kubelet[2926]: E0119 09:26:12.759748 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.759803 kubelet[2926]: W0119 09:26:12.759768 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.759803 kubelet[2926]: E0119 09:26:12.759786 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.760242 kubelet[2926]: E0119 09:26:12.760204 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.760242 kubelet[2926]: W0119 09:26:12.760218 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.760242 kubelet[2926]: E0119 09:26:12.760229 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.760633 kubelet[2926]: E0119 09:26:12.760604 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.760633 kubelet[2926]: W0119 09:26:12.760613 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.760633 kubelet[2926]: E0119 09:26:12.760622 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.760968 kubelet[2926]: E0119 09:26:12.760940 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.760968 kubelet[2926]: W0119 09:26:12.760950 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.760968 kubelet[2926]: E0119 09:26:12.760958 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.761303 kubelet[2926]: E0119 09:26:12.761250 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.761303 kubelet[2926]: W0119 09:26:12.761259 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.761303 kubelet[2926]: E0119 09:26:12.761267 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:12.761587 kubelet[2926]: E0119 09:26:12.761557 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.761587 kubelet[2926]: W0119 09:26:12.761566 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.761587 kubelet[2926]: E0119 09:26:12.761572 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.785531 kubelet[2926]: E0119 09:26:12.785448 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.785531 kubelet[2926]: W0119 09:26:12.785477 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.785531 kubelet[2926]: E0119 09:26:12.785496 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.793967 containerd[1676]: time="2026-01-19T09:26:12.793913188Z" level=info msg="connecting to shim aad28f601f9baec9b2a45cd9f9b9ce09240d6c56e8b9599026ffac784e75f247" address="unix:///run/containerd/s/0c9e9f80f079008c55fa4bf8492ac2997a5790a5c935fa79d1ddfc5fa5ec9d2b" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:26:12.829250 kubelet[2926]: E0119 09:26:12.829214 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:26:12.842226 kubelet[2926]: E0119 09:26:12.841892 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.842652 kubelet[2926]: W0119 09:26:12.842637 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.843009 kubelet[2926]: E0119 09:26:12.842913 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.844064 kubelet[2926]: E0119 09:26:12.843919 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.844427 kubelet[2926]: W0119 09:26:12.844344 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.844427 kubelet[2926]: E0119 09:26:12.844360 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:12.845223 kubelet[2926]: E0119 09:26:12.845212 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.845559 kubelet[2926]: W0119 09:26:12.845311 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.845559 kubelet[2926]: E0119 09:26:12.845323 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.846866 systemd[1]: Started cri-containerd-aad28f601f9baec9b2a45cd9f9b9ce09240d6c56e8b9599026ffac784e75f247.scope - libcontainer container aad28f601f9baec9b2a45cd9f9b9ce09240d6c56e8b9599026ffac784e75f247. Jan 19 09:26:12.848595 kubelet[2926]: E0119 09:26:12.848386 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.848595 kubelet[2926]: W0119 09:26:12.848413 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.848595 kubelet[2926]: E0119 09:26:12.848423 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.850578 kubelet[2926]: E0119 09:26:12.849766 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.850578 kubelet[2926]: W0119 09:26:12.849777 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.850578 kubelet[2926]: E0119 09:26:12.849787 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.850578 kubelet[2926]: E0119 09:26:12.850118 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.850578 kubelet[2926]: W0119 09:26:12.850126 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.850578 kubelet[2926]: E0119 09:26:12.850133 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:12.851053 kubelet[2926]: E0119 09:26:12.850827 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.851053 kubelet[2926]: W0119 09:26:12.850837 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.851053 kubelet[2926]: E0119 09:26:12.850844 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.851695 kubelet[2926]: E0119 09:26:12.851685 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.851739 kubelet[2926]: W0119 09:26:12.851732 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.851770 kubelet[2926]: E0119 09:26:12.851763 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.854788 kubelet[2926]: E0119 09:26:12.854443 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.854788 kubelet[2926]: W0119 09:26:12.854554 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.854788 kubelet[2926]: E0119 09:26:12.854563 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.857085 kubelet[2926]: E0119 09:26:12.855960 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.857085 kubelet[2926]: W0119 09:26:12.855975 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.857085 kubelet[2926]: E0119 09:26:12.855984 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.857085 kubelet[2926]: E0119 09:26:12.856170 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.857085 kubelet[2926]: W0119 09:26:12.856177 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.857085 kubelet[2926]: E0119 09:26:12.856184 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:12.857085 kubelet[2926]: E0119 09:26:12.856341 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.857085 kubelet[2926]: W0119 09:26:12.856348 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.857085 kubelet[2926]: E0119 09:26:12.856357 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.857085 kubelet[2926]: E0119 09:26:12.856593 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.857288 kubelet[2926]: W0119 09:26:12.856599 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.857288 kubelet[2926]: E0119 09:26:12.856606 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.857288 kubelet[2926]: E0119 09:26:12.856744 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.857288 kubelet[2926]: W0119 09:26:12.856750 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.857288 kubelet[2926]: E0119 09:26:12.856755 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.857288 kubelet[2926]: E0119 09:26:12.856888 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.857288 kubelet[2926]: W0119 09:26:12.856893 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.857288 kubelet[2926]: E0119 09:26:12.856899 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.857288 kubelet[2926]: E0119 09:26:12.857053 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.857288 kubelet[2926]: W0119 09:26:12.857061 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.857485 kubelet[2926]: E0119 09:26:12.857070 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:12.858110 kubelet[2926]: E0119 09:26:12.857938 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.858110 kubelet[2926]: W0119 09:26:12.857950 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.858110 kubelet[2926]: E0119 09:26:12.857957 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.858920 kubelet[2926]: E0119 09:26:12.858516 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.858920 kubelet[2926]: W0119 09:26:12.858525 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.858920 kubelet[2926]: E0119 09:26:12.858532 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.859112 kubelet[2926]: E0119 09:26:12.859056 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.859112 kubelet[2926]: W0119 09:26:12.859068 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.859112 kubelet[2926]: E0119 09:26:12.859076 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.859330 kubelet[2926]: E0119 09:26:12.859322 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.859380 kubelet[2926]: W0119 09:26:12.859360 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.859380 kubelet[2926]: E0119 09:26:12.859368 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.859804 kubelet[2926]: E0119 09:26:12.859795 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.859845 kubelet[2926]: W0119 09:26:12.859838 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.859874 kubelet[2926]: E0119 09:26:12.859867 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:12.859980 kubelet[2926]: I0119 09:26:12.859925 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf97e1c1-8e02-43e7-988e-adabcb45ce04-kubelet-dir\") pod \"csi-node-driver-km4zx\" (UID: \"cf97e1c1-8e02-43e7-988e-adabcb45ce04\") " pod="calico-system/csi-node-driver-km4zx" Jan 19 09:26:12.860286 kubelet[2926]: E0119 09:26:12.860277 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.860328 kubelet[2926]: W0119 09:26:12.860321 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.860358 kubelet[2926]: E0119 09:26:12.860351 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.860715 kubelet[2926]: E0119 09:26:12.860705 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.860757 kubelet[2926]: W0119 09:26:12.860750 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.860797 kubelet[2926]: E0119 09:26:12.860790 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.860893 kubelet[2926]: I0119 09:26:12.860838 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cf97e1c1-8e02-43e7-988e-adabcb45ce04-registration-dir\") pod \"csi-node-driver-km4zx\" (UID: \"cf97e1c1-8e02-43e7-988e-adabcb45ce04\") " pod="calico-system/csi-node-driver-km4zx" Jan 19 09:26:12.863331 kubelet[2926]: E0119 09:26:12.863102 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.863331 kubelet[2926]: W0119 09:26:12.863111 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.863331 kubelet[2926]: E0119 09:26:12.863119 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.863612 kubelet[2926]: E0119 09:26:12.863603 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.864265 kubelet[2926]: W0119 09:26:12.863653 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.864265 kubelet[2926]: E0119 09:26:12.863663 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:12.864265 kubelet[2926]: E0119 09:26:12.864085 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.864265 kubelet[2926]: W0119 09:26:12.864091 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.864265 kubelet[2926]: E0119 09:26:12.864098 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.864505 kubelet[2926]: E0119 09:26:12.864384 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.864505 kubelet[2926]: W0119 09:26:12.864411 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.864505 kubelet[2926]: E0119 09:26:12.864417 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.864505 kubelet[2926]: I0119 09:26:12.864439 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cf97e1c1-8e02-43e7-988e-adabcb45ce04-socket-dir\") pod \"csi-node-driver-km4zx\" (UID: \"cf97e1c1-8e02-43e7-988e-adabcb45ce04\") " pod="calico-system/csi-node-driver-km4zx" Jan 19 09:26:12.864827 kubelet[2926]: E0119 09:26:12.864817 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.864944 kubelet[2926]: W0119 09:26:12.864868 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.864993 kubelet[2926]: E0119 09:26:12.864983 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.865047 kubelet[2926]: I0119 09:26:12.865037 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/cf97e1c1-8e02-43e7-988e-adabcb45ce04-varrun\") pod \"csi-node-driver-km4zx\" (UID: \"cf97e1c1-8e02-43e7-988e-adabcb45ce04\") " pod="calico-system/csi-node-driver-km4zx" Jan 19 09:26:12.865443 kubelet[2926]: E0119 09:26:12.865434 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.865497 kubelet[2926]: W0119 09:26:12.865490 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.865528 kubelet[2926]: E0119 09:26:12.865521 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:12.865644 kubelet[2926]: I0119 09:26:12.865577 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsltn\" (UniqueName: \"kubernetes.io/projected/cf97e1c1-8e02-43e7-988e-adabcb45ce04-kube-api-access-xsltn\") pod \"csi-node-driver-km4zx\" (UID: \"cf97e1c1-8e02-43e7-988e-adabcb45ce04\") " pod="calico-system/csi-node-driver-km4zx" Jan 19 09:26:12.866046 kubelet[2926]: E0119 09:26:12.866024 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.866139 kubelet[2926]: W0119 09:26:12.866104 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.866214 kubelet[2926]: E0119 09:26:12.866184 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.866574 kubelet[2926]: E0119 09:26:12.866564 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.866614 kubelet[2926]: W0119 09:26:12.866607 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.866643 kubelet[2926]: E0119 09:26:12.866636 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.867188 kubelet[2926]: E0119 09:26:12.867091 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.867272 kubelet[2926]: W0119 09:26:12.867262 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.867313 kubelet[2926]: E0119 09:26:12.867305 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.867605 kubelet[2926]: E0119 09:26:12.867597 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.867663 kubelet[2926]: W0119 09:26:12.867655 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.867743 kubelet[2926]: E0119 09:26:12.867735 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:12.869047 kubelet[2926]: E0119 09:26:12.869034 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.869207 kubelet[2926]: W0119 09:26:12.869194 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.869267 kubelet[2926]: E0119 09:26:12.869257 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.869865 kubelet[2926]: E0119 09:26:12.869842 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.870006 kubelet[2926]: W0119 09:26:12.869945 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.870064 kubelet[2926]: E0119 09:26:12.870053 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.891000 audit: BPF prog-id=151 op=LOAD Jan 19 09:26:12.893000 audit: BPF prog-id=152 op=LOAD Jan 19 09:26:12.893000 audit[3362]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3350 pid=3362 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.893000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161643238663630316639626165633962326134356364396639623963 Jan 19 09:26:12.893000 audit: BPF prog-id=152 op=UNLOAD Jan 19 09:26:12.893000 audit[3362]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3350 pid=3362 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.893000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161643238663630316639626165633962326134356364396639623963 Jan 19 09:26:12.893000 audit: BPF prog-id=153 op=LOAD Jan 19 09:26:12.893000 audit[3362]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3350 pid=3362 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.893000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161643238663630316639626165633962326134356364396639623963 Jan 19 09:26:12.893000 audit: BPF prog-id=154 op=LOAD Jan 19 
09:26:12.893000 audit[3362]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3350 pid=3362 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.893000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161643238663630316639626165633962326134356364396639623963 Jan 19 09:26:12.893000 audit: BPF prog-id=154 op=UNLOAD Jan 19 09:26:12.893000 audit[3362]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3350 pid=3362 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.893000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161643238663630316639626165633962326134356364396639623963 Jan 19 09:26:12.893000 audit: BPF prog-id=153 op=UNLOAD Jan 19 09:26:12.893000 audit[3362]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3350 pid=3362 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.893000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161643238663630316639626165633962326134356364396639623963 Jan 19 09:26:12.893000 audit: BPF prog-id=155 op=LOAD Jan 19 09:26:12.893000 audit[3362]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3350 pid=3362 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.893000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161643238663630316639626165633962326134356364396639623963 Jan 19 09:26:12.930966 containerd[1676]: time="2026-01-19T09:26:12.928646533Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-2vxc2,Uid:95c50da7-f0be-44f4-9371-f1b8d2b1156d,Namespace:calico-system,Attempt:0,}" Jan 19 09:26:12.942487 containerd[1676]: time="2026-01-19T09:26:12.942416755Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8666647ccf-bkmvf,Uid:34707322-eca1-4e91-b4cf-5a3d076b44f5,Namespace:calico-system,Attempt:0,} returns sandbox id \"aad28f601f9baec9b2a45cd9f9b9ce09240d6c56e8b9599026ffac784e75f247\"" Jan 19 09:26:12.943914 containerd[1676]: time="2026-01-19T09:26:12.943771434Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 19 09:26:12.952291 containerd[1676]: time="2026-01-19T09:26:12.952264312Z" level=info msg="connecting to shim 8c8d449c43434b8d8d4b81c17a31d9c86429595be836620c2fc2a01e2d9d7cee" 
address="unix:///run/containerd/s/595d62d79192058cd2ab87446e4025637da4cac425267a38771d7e8db85ee68c" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:26:12.967739 kubelet[2926]: E0119 09:26:12.967714 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.968201 kubelet[2926]: W0119 09:26:12.967826 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.968201 kubelet[2926]: E0119 09:26:12.967847 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.968725 kubelet[2926]: E0119 09:26:12.968663 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.968725 kubelet[2926]: W0119 09:26:12.968681 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.968725 kubelet[2926]: E0119 09:26:12.968723 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.968995 kubelet[2926]: E0119 09:26:12.968969 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.968995 kubelet[2926]: W0119 09:26:12.968981 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.968995 kubelet[2926]: E0119 09:26:12.968987 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.969241 kubelet[2926]: E0119 09:26:12.969195 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.969241 kubelet[2926]: W0119 09:26:12.969222 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.969241 kubelet[2926]: E0119 09:26:12.969228 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.969591 kubelet[2926]: E0119 09:26:12.969576 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.969591 kubelet[2926]: W0119 09:26:12.969590 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.969647 kubelet[2926]: E0119 09:26:12.969600 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:12.969984 kubelet[2926]: E0119 09:26:12.969924 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.969984 kubelet[2926]: W0119 09:26:12.969935 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.969984 kubelet[2926]: E0119 09:26:12.969943 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.970269 kubelet[2926]: E0119 09:26:12.970237 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.970377 kubelet[2926]: W0119 09:26:12.970353 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.970377 kubelet[2926]: E0119 09:26:12.970371 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.970808 kubelet[2926]: E0119 09:26:12.970788 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.970808 kubelet[2926]: W0119 09:26:12.970800 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.970808 kubelet[2926]: E0119 09:26:12.970807 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.971109 kubelet[2926]: E0119 09:26:12.971097 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.971109 kubelet[2926]: W0119 09:26:12.971105 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.971109 kubelet[2926]: E0119 09:26:12.971113 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.971373 kubelet[2926]: E0119 09:26:12.971359 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.971373 kubelet[2926]: W0119 09:26:12.971368 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.971448 kubelet[2926]: E0119 09:26:12.971376 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:12.971869 kubelet[2926]: E0119 09:26:12.971850 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.971869 kubelet[2926]: W0119 09:26:12.971862 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.971869 kubelet[2926]: E0119 09:26:12.971869 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.972190 kubelet[2926]: E0119 09:26:12.972170 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.972190 kubelet[2926]: W0119 09:26:12.972182 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.972190 kubelet[2926]: E0119 09:26:12.972189 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.972444 kubelet[2926]: E0119 09:26:12.972421 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.972444 kubelet[2926]: W0119 09:26:12.972430 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.972444 kubelet[2926]: E0119 09:26:12.972439 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.972625 systemd[1]: Started cri-containerd-8c8d449c43434b8d8d4b81c17a31d9c86429595be836620c2fc2a01e2d9d7cee.scope - libcontainer container 8c8d449c43434b8d8d4b81c17a31d9c86429595be836620c2fc2a01e2d9d7cee. Jan 19 09:26:12.973192 kubelet[2926]: E0119 09:26:12.973178 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.973192 kubelet[2926]: W0119 09:26:12.973192 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.973244 kubelet[2926]: E0119 09:26:12.973203 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:12.973538 kubelet[2926]: E0119 09:26:12.973485 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.973538 kubelet[2926]: W0119 09:26:12.973516 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.973538 kubelet[2926]: E0119 09:26:12.973522 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.974024 kubelet[2926]: E0119 09:26:12.974011 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.974024 kubelet[2926]: W0119 09:26:12.974020 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.974076 kubelet[2926]: E0119 09:26:12.974027 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.974231 kubelet[2926]: E0119 09:26:12.974214 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.974231 kubelet[2926]: W0119 09:26:12.974228 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.974290 kubelet[2926]: E0119 09:26:12.974236 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.974712 kubelet[2926]: E0119 09:26:12.974485 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.974712 kubelet[2926]: W0119 09:26:12.974492 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.974712 kubelet[2926]: E0119 09:26:12.974499 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.974786 kubelet[2926]: E0119 09:26:12.974745 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.974786 kubelet[2926]: W0119 09:26:12.974751 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.974786 kubelet[2926]: E0119 09:26:12.974758 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:12.974968 kubelet[2926]: E0119 09:26:12.974948 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.974968 kubelet[2926]: W0119 09:26:12.974962 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.974968 kubelet[2926]: E0119 09:26:12.974968 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.975269 kubelet[2926]: E0119 09:26:12.975247 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.975269 kubelet[2926]: W0119 09:26:12.975261 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.975269 kubelet[2926]: E0119 09:26:12.975268 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.975505 kubelet[2926]: E0119 09:26:12.975488 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.975505 kubelet[2926]: W0119 09:26:12.975499 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.975505 kubelet[2926]: E0119 09:26:12.975506 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.975718 kubelet[2926]: E0119 09:26:12.975701 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.975718 kubelet[2926]: W0119 09:26:12.975714 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.975757 kubelet[2926]: E0119 09:26:12.975723 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.976187 kubelet[2926]: E0119 09:26:12.976161 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.976187 kubelet[2926]: W0119 09:26:12.976176 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.976187 kubelet[2926]: E0119 09:26:12.976185 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:12.977322 kubelet[2926]: E0119 09:26:12.977279 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.977322 kubelet[2926]: W0119 09:26:12.977296 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.977322 kubelet[2926]: E0119 09:26:12.977304 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.984614 kubelet[2926]: E0119 09:26:12.984592 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:12.984614 kubelet[2926]: W0119 09:26:12.984606 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:12.984614 kubelet[2926]: E0119 09:26:12.984616 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:12.985000 audit: BPF prog-id=156 op=LOAD Jan 19 09:26:12.985000 audit: BPF prog-id=157 op=LOAD Jan 19 09:26:12.985000 audit[3454]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=3441 pid=3454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.985000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863386434343963343334333462386438643462383163313761333164 Jan 19 09:26:12.985000 audit: BPF prog-id=157 op=UNLOAD Jan 19 09:26:12.985000 audit[3454]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3441 pid=3454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.985000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863386434343963343334333462386438643462383163313761333164 Jan 19 09:26:12.985000 audit: BPF prog-id=158 op=LOAD Jan 19 09:26:12.985000 audit[3454]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3441 pid=3454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.985000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863386434343963343334333462386438643462383163313761333164 Jan 19 09:26:12.985000 audit: BPF prog-id=159 op=LOAD Jan 19 
09:26:12.985000 audit[3454]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3441 pid=3454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.985000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863386434343963343334333462386438643462383163313761333164 Jan 19 09:26:12.985000 audit: BPF prog-id=159 op=UNLOAD Jan 19 09:26:12.985000 audit[3454]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3441 pid=3454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.985000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863386434343963343334333462386438643462383163313761333164 Jan 19 09:26:12.986000 audit: BPF prog-id=158 op=UNLOAD Jan 19 09:26:12.986000 audit[3454]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3441 pid=3454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.986000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863386434343963343334333462386438643462383163313761333164 Jan 19 09:26:12.986000 audit: BPF prog-id=160 op=LOAD Jan 19 09:26:12.986000 audit[3454]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3441 pid=3454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:12.986000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863386434343963343334333462386438643462383163313761333164 Jan 19 09:26:13.004796 containerd[1676]: time="2026-01-19T09:26:13.004745260Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-2vxc2,Uid:95c50da7-f0be-44f4-9371-f1b8d2b1156d,Namespace:calico-system,Attempt:0,} returns sandbox id \"8c8d449c43434b8d8d4b81c17a31d9c86429595be836620c2fc2a01e2d9d7cee\"" Jan 19 09:26:13.464000 audit[3507]: NETFILTER_CFG table=filter:113 family=2 entries=22 op=nft_register_rule pid=3507 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:13.468052 kernel: kauditd_printk_skb: 63 callbacks suppressed Jan 19 09:26:13.468083 kernel: audit: type=1325 audit(1768814773.464:557): table=filter:113 family=2 entries=22 op=nft_register_rule pid=3507 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:13.464000 audit[3507]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffe7ed94f30 a2=0 
a3=7ffe7ed94f1c items=0 ppid=3032 pid=3507 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:13.474663 kernel: audit: type=1300 audit(1768814773.464:557): arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffe7ed94f30 a2=0 a3=7ffe7ed94f1c items=0 ppid=3032 pid=3507 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:13.464000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:13.480975 kernel: audit: type=1327 audit(1768814773.464:557): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:13.474000 audit[3507]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3507 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:13.485116 kernel: audit: type=1325 audit(1768814773.474:558): table=nat:114 family=2 entries=12 op=nft_register_rule pid=3507 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:13.474000 audit[3507]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe7ed94f30 a2=0 a3=0 items=0 ppid=3032 pid=3507 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:13.494757 kernel: audit: type=1300 audit(1768814773.474:558): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe7ed94f30 a2=0 a3=0 items=0 ppid=3032 pid=3507 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:13.494799 kernel: audit: type=1327 audit(1768814773.474:558): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:13.474000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:14.224515 kubelet[2926]: E0119 09:26:14.223889 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:26:14.885344 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount958269552.mount: Deactivated successfully. 
Jan 19 09:26:15.884639 containerd[1676]: time="2026-01-19T09:26:15.884580032Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:26:15.885805 containerd[1676]: time="2026-01-19T09:26:15.885718956Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33735893" Jan 19 09:26:15.886819 containerd[1676]: time="2026-01-19T09:26:15.886778737Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:26:15.888961 containerd[1676]: time="2026-01-19T09:26:15.888925442Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:26:15.889682 containerd[1676]: time="2026-01-19T09:26:15.889471369Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.945665455s" Jan 19 09:26:15.889682 containerd[1676]: time="2026-01-19T09:26:15.889510730Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Jan 19 09:26:15.891223 containerd[1676]: time="2026-01-19T09:26:15.891209559Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 19 09:26:15.908739 containerd[1676]: time="2026-01-19T09:26:15.908696872Z" level=info msg="CreateContainer within sandbox \"aad28f601f9baec9b2a45cd9f9b9ce09240d6c56e8b9599026ffac784e75f247\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 19 09:26:15.917659 containerd[1676]: time="2026-01-19T09:26:15.917615036Z" level=info msg="Container 0903adbf7d087c4e05a238f566757d41ab070daf6bf5e3538a72ffb36db186dc: CDI devices from CRI Config.CDIDevices: []" Jan 19 09:26:15.925782 containerd[1676]: time="2026-01-19T09:26:15.925747420Z" level=info msg="CreateContainer within sandbox \"aad28f601f9baec9b2a45cd9f9b9ce09240d6c56e8b9599026ffac784e75f247\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"0903adbf7d087c4e05a238f566757d41ab070daf6bf5e3538a72ffb36db186dc\"" Jan 19 09:26:15.926333 containerd[1676]: time="2026-01-19T09:26:15.926311496Z" level=info msg="StartContainer for \"0903adbf7d087c4e05a238f566757d41ab070daf6bf5e3538a72ffb36db186dc\"" Jan 19 09:26:15.927413 containerd[1676]: time="2026-01-19T09:26:15.927381029Z" level=info msg="connecting to shim 0903adbf7d087c4e05a238f566757d41ab070daf6bf5e3538a72ffb36db186dc" address="unix:///run/containerd/s/0c9e9f80f079008c55fa4bf8492ac2997a5790a5c935fa79d1ddfc5fa5ec9d2b" protocol=ttrpc version=3 Jan 19 09:26:15.948553 systemd[1]: Started cri-containerd-0903adbf7d087c4e05a238f566757d41ab070daf6bf5e3538a72ffb36db186dc.scope - libcontainer container 0903adbf7d087c4e05a238f566757d41ab070daf6bf5e3538a72ffb36db186dc. 
Jan 19 09:26:15.960000 audit: BPF prog-id=161 op=LOAD Jan 19 09:26:15.964415 kernel: audit: type=1334 audit(1768814775.960:559): prog-id=161 op=LOAD Jan 19 09:26:15.963000 audit: BPF prog-id=162 op=LOAD Jan 19 09:26:15.963000 audit[3521]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3350 pid=3521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:15.975246 kernel: audit: type=1334 audit(1768814775.963:560): prog-id=162 op=LOAD Jan 19 09:26:15.975331 kernel: audit: type=1300 audit(1768814775.963:560): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3350 pid=3521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:15.963000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039303361646266376430383763346530356132333866353636373537 Jan 19 09:26:15.963000 audit: BPF prog-id=162 op=UNLOAD Jan 19 09:26:15.963000 audit[3521]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3350 pid=3521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:15.963000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039303361646266376430383763346530356132333866353636373537 Jan 19 09:26:15.983454 kernel: audit: type=1327 audit(1768814775.963:560): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039303361646266376430383763346530356132333866353636373537 Jan 19 09:26:15.963000 audit: BPF prog-id=163 op=LOAD Jan 19 09:26:15.963000 audit[3521]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3350 pid=3521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:15.963000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039303361646266376430383763346530356132333866353636373537 Jan 19 09:26:15.963000 audit: BPF prog-id=164 op=LOAD Jan 19 09:26:15.963000 audit[3521]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3350 pid=3521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:15.963000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039303361646266376430383763346530356132333866353636373537 Jan 19 09:26:15.963000 audit: BPF prog-id=164 op=UNLOAD Jan 19 09:26:15.963000 audit[3521]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3350 pid=3521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:15.963000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039303361646266376430383763346530356132333866353636373537 Jan 19 09:26:15.963000 audit: BPF prog-id=163 op=UNLOAD Jan 19 09:26:15.963000 audit[3521]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3350 pid=3521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:15.963000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039303361646266376430383763346530356132333866353636373537 Jan 19 09:26:15.963000 audit: BPF prog-id=165 op=LOAD Jan 19 09:26:15.963000 audit[3521]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3350 pid=3521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:15.963000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039303361646266376430383763346530356132333866353636373537 Jan 19 09:26:16.000916 containerd[1676]: time="2026-01-19T09:26:16.000877033Z" level=info msg="StartContainer for \"0903adbf7d087c4e05a238f566757d41ab070daf6bf5e3538a72ffb36db186dc\" returns successfully" Jan 19 09:26:16.225074 kubelet[2926]: E0119 09:26:16.224681 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:26:16.385850 kubelet[2926]: E0119 09:26:16.385766 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.385850 kubelet[2926]: W0119 09:26:16.385788 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.385850 kubelet[2926]: E0119 09:26:16.385807 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:16.386206 kubelet[2926]: E0119 09:26:16.386157 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.386206 kubelet[2926]: W0119 09:26:16.386166 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.386206 kubelet[2926]: E0119 09:26:16.386175 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.386471 kubelet[2926]: E0119 09:26:16.386463 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.386519 kubelet[2926]: W0119 09:26:16.386503 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.386657 kubelet[2926]: E0119 09:26:16.386552 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.386808 kubelet[2926]: E0119 09:26:16.386801 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.386922 kubelet[2926]: W0119 09:26:16.386848 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.386953 kubelet[2926]: E0119 09:26:16.386946 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.387161 kubelet[2926]: E0119 09:26:16.387154 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.387204 kubelet[2926]: W0119 09:26:16.387190 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.387315 kubelet[2926]: E0119 09:26:16.387283 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.388479 kubelet[2926]: E0119 09:26:16.388414 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.388479 kubelet[2926]: W0119 09:26:16.388425 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.388479 kubelet[2926]: E0119 09:26:16.388437 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:16.388799 kubelet[2926]: E0119 09:26:16.388747 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.388799 kubelet[2926]: W0119 09:26:16.388757 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.388799 kubelet[2926]: E0119 09:26:16.388764 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.389036 kubelet[2926]: E0119 09:26:16.389027 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.389181 kubelet[2926]: W0119 09:26:16.389088 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.389181 kubelet[2926]: E0119 09:26:16.389098 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.389281 kubelet[2926]: E0119 09:26:16.389272 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.389320 kubelet[2926]: W0119 09:26:16.389313 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.389352 kubelet[2926]: E0119 09:26:16.389346 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.389650 kubelet[2926]: E0119 09:26:16.389596 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.389650 kubelet[2926]: W0119 09:26:16.389605 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.389650 kubelet[2926]: E0119 09:26:16.389612 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.390264 kubelet[2926]: E0119 09:26:16.390199 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.390264 kubelet[2926]: W0119 09:26:16.390208 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.390264 kubelet[2926]: E0119 09:26:16.390216 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:16.390593 kubelet[2926]: E0119 09:26:16.390505 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.390593 kubelet[2926]: W0119 09:26:16.390522 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.390593 kubelet[2926]: E0119 09:26:16.390529 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.390866 kubelet[2926]: E0119 09:26:16.390807 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.390866 kubelet[2926]: W0119 09:26:16.390815 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.390866 kubelet[2926]: E0119 09:26:16.390822 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.391379 kubelet[2926]: E0119 09:26:16.391242 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.391379 kubelet[2926]: W0119 09:26:16.391337 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.391379 kubelet[2926]: E0119 09:26:16.391344 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.391736 kubelet[2926]: E0119 09:26:16.391721 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.392448 kubelet[2926]: W0119 09:26:16.392436 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.392505 kubelet[2926]: E0119 09:26:16.392496 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.395123 kubelet[2926]: E0119 09:26:16.395081 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.395123 kubelet[2926]: W0119 09:26:16.395092 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.395123 kubelet[2926]: E0119 09:26:16.395103 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:16.396554 kubelet[2926]: E0119 09:26:16.396524 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.396554 kubelet[2926]: W0119 09:26:16.396534 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.396554 kubelet[2926]: E0119 09:26:16.396543 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.396914 kubelet[2926]: E0119 09:26:16.396889 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.396914 kubelet[2926]: W0119 09:26:16.396898 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.396914 kubelet[2926]: E0119 09:26:16.396905 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.397428 kubelet[2926]: E0119 09:26:16.397229 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.397428 kubelet[2926]: W0119 09:26:16.397405 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.397428 kubelet[2926]: E0119 09:26:16.397415 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.399443 kubelet[2926]: E0119 09:26:16.397746 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.399443 kubelet[2926]: W0119 09:26:16.399413 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.399443 kubelet[2926]: E0119 09:26:16.399426 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.400125 kubelet[2926]: E0119 09:26:16.400036 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.400180 kubelet[2926]: W0119 09:26:16.400162 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.400180 kubelet[2926]: E0119 09:26:16.400171 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:16.400719 kubelet[2926]: E0119 09:26:16.400687 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.400719 kubelet[2926]: W0119 09:26:16.400696 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.400719 kubelet[2926]: E0119 09:26:16.400703 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.401585 kubelet[2926]: E0119 09:26:16.401183 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.401585 kubelet[2926]: W0119 09:26:16.401206 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.401585 kubelet[2926]: E0119 09:26:16.401213 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.401998 kubelet[2926]: E0119 09:26:16.401834 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.401998 kubelet[2926]: W0119 09:26:16.401842 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.401998 kubelet[2926]: E0119 09:26:16.401849 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.402320 kubelet[2926]: E0119 09:26:16.402209 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.402320 kubelet[2926]: W0119 09:26:16.402217 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.402320 kubelet[2926]: E0119 09:26:16.402224 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.404039 kubelet[2926]: E0119 09:26:16.404010 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.404039 kubelet[2926]: W0119 09:26:16.404021 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.404039 kubelet[2926]: E0119 09:26:16.404029 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:16.404406 kubelet[2926]: E0119 09:26:16.404287 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.404453 kubelet[2926]: W0119 09:26:16.404444 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.404490 kubelet[2926]: E0119 09:26:16.404483 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.404692 kubelet[2926]: E0119 09:26:16.404681 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.404763 kubelet[2926]: W0119 09:26:16.404755 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.404797 kubelet[2926]: E0119 09:26:16.404790 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.405064 kubelet[2926]: E0119 09:26:16.405041 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.405064 kubelet[2926]: W0119 09:26:16.405049 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.405064 kubelet[2926]: E0119 09:26:16.405055 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.405424 kubelet[2926]: E0119 09:26:16.405352 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.405424 kubelet[2926]: W0119 09:26:16.405360 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.405424 kubelet[2926]: E0119 09:26:16.405366 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.405729 kubelet[2926]: E0119 09:26:16.405705 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.405729 kubelet[2926]: W0119 09:26:16.405713 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.405729 kubelet[2926]: E0119 09:26:16.405720 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:16.407521 kubelet[2926]: E0119 09:26:16.407447 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.407521 kubelet[2926]: W0119 09:26:16.407455 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.407521 kubelet[2926]: E0119 09:26:16.407462 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:16.407769 kubelet[2926]: E0119 09:26:16.407737 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:16.407769 kubelet[2926]: W0119 09:26:16.407747 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:16.407769 kubelet[2926]: E0119 09:26:16.407753 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:17.329265 kubelet[2926]: I0119 09:26:17.329219 2926 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 19 09:26:17.401469 kubelet[2926]: E0119 09:26:17.400503 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:17.401469 kubelet[2926]: W0119 09:26:17.400602 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:17.401469 kubelet[2926]: E0119 09:26:17.400639 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:17.401469 kubelet[2926]: E0119 09:26:17.401312 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:17.401469 kubelet[2926]: W0119 09:26:17.401333 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:17.401469 kubelet[2926]: E0119 09:26:17.401360 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:17.402437 kubelet[2926]: E0119 09:26:17.402356 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:17.402565 kubelet[2926]: W0119 09:26:17.402466 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:17.402565 kubelet[2926]: E0119 09:26:17.402491 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:17.403491 kubelet[2926]: E0119 09:26:17.403435 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:17.403491 kubelet[2926]: W0119 09:26:17.403460 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:17.403491 kubelet[2926]: E0119 09:26:17.403485 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:17.404351 kubelet[2926]: E0119 09:26:17.404316 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:17.404351 kubelet[2926]: W0119 09:26:17.404346 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:17.404564 kubelet[2926]: E0119 09:26:17.404369 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:17.404923 kubelet[2926]: E0119 09:26:17.404893 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:17.404923 kubelet[2926]: W0119 09:26:17.404917 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:17.405083 kubelet[2926]: E0119 09:26:17.404939 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:17.405448 kubelet[2926]: E0119 09:26:17.405359 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:17.405448 kubelet[2926]: W0119 09:26:17.405385 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:17.405448 kubelet[2926]: E0119 09:26:17.405438 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:17.405961 kubelet[2926]: E0119 09:26:17.405921 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:17.405961 kubelet[2926]: W0119 09:26:17.405954 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:17.406155 kubelet[2926]: E0119 09:26:17.405978 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:17.406651 kubelet[2926]: E0119 09:26:17.406571 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:17.406651 kubelet[2926]: W0119 09:26:17.406597 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:17.406651 kubelet[2926]: E0119 09:26:17.406618 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:17.407162 kubelet[2926]: E0119 09:26:17.407068 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:17.407162 kubelet[2926]: W0119 09:26:17.407098 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:17.407162 kubelet[2926]: E0119 09:26:17.407119 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:17.407712 kubelet[2926]: E0119 09:26:17.407636 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:17.407807 kubelet[2926]: W0119 09:26:17.407735 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:17.407807 kubelet[2926]: E0119 09:26:17.407759 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:17.408433 kubelet[2926]: E0119 09:26:17.408223 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:17.408433 kubelet[2926]: W0119 09:26:17.408250 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:17.408433 kubelet[2926]: E0119 09:26:17.408272 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 19 09:26:17.408943 kubelet[2926]: E0119 09:26:17.408907 2926 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 19 09:26:17.408943 kubelet[2926]: W0119 09:26:17.408930 2926 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 19 09:26:17.409292 kubelet[2926]: E0119 09:26:17.408952 2926 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 19 09:26:17.820573 containerd[1676]: time="2026-01-19T09:26:17.820477636Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:26:17.821916 containerd[1676]: time="2026-01-19T09:26:17.821857401Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:17.823336 containerd[1676]: time="2026-01-19T09:26:17.823264525Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:26:17.825886 containerd[1676]: time="2026-01-19T09:26:17.825812561Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:26:17.826606 containerd[1676]: time="2026-01-19T09:26:17.826563418Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.935281218s" Jan 19 09:26:17.826669 containerd[1676]: time="2026-01-19T09:26:17.826604669Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Jan 19 09:26:17.833559 containerd[1676]: time="2026-01-19T09:26:17.833490460Z" level=info msg="CreateContainer within sandbox \"8c8d449c43434b8d8d4b81c17a31d9c86429595be836620c2fc2a01e2d9d7cee\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 19 09:26:17.846411 containerd[1676]: time="2026-01-19T09:26:17.846070540Z" level=info msg="Container 73579ec58174653b6ca8894a994439237ec4dc97257dc71e0b873e74964714dd: CDI devices from CRI Config.CDIDevices: []" Jan 19 09:26:17.866686 containerd[1676]: time="2026-01-19T09:26:17.866622782Z" level=info msg="CreateContainer within sandbox \"8c8d449c43434b8d8d4b81c17a31d9c86429595be836620c2fc2a01e2d9d7cee\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"73579ec58174653b6ca8894a994439237ec4dc97257dc71e0b873e74964714dd\"" Jan 19 09:26:17.867687 containerd[1676]: time="2026-01-19T09:26:17.867509131Z" level=info msg="StartContainer for \"73579ec58174653b6ca8894a994439237ec4dc97257dc71e0b873e74964714dd\"" Jan 19 09:26:17.869924 containerd[1676]: time="2026-01-19T09:26:17.869881416Z" level=info msg="connecting to shim 73579ec58174653b6ca8894a994439237ec4dc97257dc71e0b873e74964714dd" address="unix:///run/containerd/s/595d62d79192058cd2ab87446e4025637da4cac425267a38771d7e8db85ee68c" protocol=ttrpc version=3 Jan 19 09:26:17.903713 systemd[1]: Started cri-containerd-73579ec58174653b6ca8894a994439237ec4dc97257dc71e0b873e74964714dd.scope - libcontainer container 73579ec58174653b6ca8894a994439237ec4dc97257dc71e0b873e74964714dd. 
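The "Failed to unmarshal output for command: init" / "executable file not found in $PATH" entries above come from the kubelet probing the FlexVolume driver at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds before the flexvol-driver container (pulled and started in the lines above) has installed it. A minimal sketch of the call/response shape those errors refer to, assuming the standard FlexVolume driver convention (the kubelet execs the binary with a subcommand and parses a JSON object from stdout, so empty stdout fails unmarshalling); this is an illustrative stub, not Calico's actual uds driver:

```python
#!/usr/bin/env python3
# Illustrative FlexVolume driver stub: answer every subcommand with JSON on
# stdout, which is what the kubelet's driver-call.go expects to unmarshal.
import json
import sys


def main() -> int:
    cmd = sys.argv[1] if len(sys.argv) > 1 else ""
    if cmd == "init":
        # Conventional init reply: success plus whether the driver implements
        # attach/detach (a node-local socket driver would not).
        print(json.dumps({"status": "Success",
                          "capabilities": {"attach": False}}))
        return 0
    # Unimplemented subcommands should still reply in JSON rather than
    # printing nothing, which is exactly the failure logged above.
    print(json.dumps({"status": "Not supported",
                      "message": f"unsupported command: {cmd}"}))
    return 1


if __name__ == "__main__":
    sys.exit(main())
```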
Jan 19 09:26:17.978000 audit: BPF prog-id=166 op=LOAD Jan 19 09:26:17.978000 audit[3629]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3441 pid=3629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:17.978000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733353739656335383137343635336236636138383934613939343433 Jan 19 09:26:17.978000 audit: BPF prog-id=167 op=LOAD Jan 19 09:26:17.978000 audit[3629]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3441 pid=3629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:17.978000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733353739656335383137343635336236636138383934613939343433 Jan 19 09:26:17.978000 audit: BPF prog-id=167 op=UNLOAD Jan 19 09:26:17.978000 audit[3629]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3441 pid=3629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:17.978000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733353739656335383137343635336236636138383934613939343433 Jan 19 09:26:17.978000 audit: BPF prog-id=166 op=UNLOAD Jan 19 09:26:17.978000 audit[3629]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3441 pid=3629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:17.978000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733353739656335383137343635336236636138383934613939343433 Jan 19 09:26:17.978000 audit: BPF prog-id=168 op=LOAD Jan 19 09:26:17.978000 audit[3629]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3441 pid=3629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:17.978000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733353739656335383137343635336236636138383934613939343433 Jan 19 09:26:18.026707 containerd[1676]: time="2026-01-19T09:26:18.026663767Z" level=info msg="StartContainer for 
\"73579ec58174653b6ca8894a994439237ec4dc97257dc71e0b873e74964714dd\" returns successfully" Jan 19 09:26:18.039547 systemd[1]: cri-containerd-73579ec58174653b6ca8894a994439237ec4dc97257dc71e0b873e74964714dd.scope: Deactivated successfully. Jan 19 09:26:18.041000 audit: BPF prog-id=168 op=UNLOAD Jan 19 09:26:18.043757 containerd[1676]: time="2026-01-19T09:26:18.043697892Z" level=info msg="received container exit event container_id:\"73579ec58174653b6ca8894a994439237ec4dc97257dc71e0b873e74964714dd\" id:\"73579ec58174653b6ca8894a994439237ec4dc97257dc71e0b873e74964714dd\" pid:3641 exited_at:{seconds:1768814778 nanos:42146317}" Jan 19 09:26:18.093605 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-73579ec58174653b6ca8894a994439237ec4dc97257dc71e0b873e74964714dd-rootfs.mount: Deactivated successfully. Jan 19 09:26:18.225336 kubelet[2926]: E0119 09:26:18.224581 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:26:18.337293 containerd[1676]: time="2026-01-19T09:26:18.337246925Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 19 09:26:18.362093 kubelet[2926]: I0119 09:26:18.361485 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-8666647ccf-bkmvf" podStartSLOduration=3.414714032 podStartE2EDuration="6.3614612s" podCreationTimestamp="2026-01-19 09:26:12 +0000 UTC" firstStartedPulling="2026-01-19 09:26:12.943595221 +0000 UTC m=+22.840413972" lastFinishedPulling="2026-01-19 09:26:15.890342379 +0000 UTC m=+25.787161140" observedRunningTime="2026-01-19 09:26:16.340542709 +0000 UTC m=+26.237361460" watchObservedRunningTime="2026-01-19 09:26:18.3614612 +0000 UTC m=+28.258279991" Jan 19 09:26:20.226110 kubelet[2926]: E0119 09:26:20.226049 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:26:22.227106 kubelet[2926]: E0119 09:26:22.227041 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:26:22.645946 containerd[1676]: time="2026-01-19T09:26:22.645872827Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:26:22.646945 containerd[1676]: time="2026-01-19T09:26:22.646848195Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70442291" Jan 19 09:26:22.647630 containerd[1676]: time="2026-01-19T09:26:22.647604211Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:26:22.649474 containerd[1676]: time="2026-01-19T09:26:22.649387344Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:26:22.650119 containerd[1676]: time="2026-01-19T09:26:22.649740137Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 4.312078768s" Jan 19 09:26:22.650119 containerd[1676]: time="2026-01-19T09:26:22.649763247Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Jan 19 09:26:22.654159 containerd[1676]: time="2026-01-19T09:26:22.654116711Z" level=info msg="CreateContainer within sandbox \"8c8d449c43434b8d8d4b81c17a31d9c86429595be836620c2fc2a01e2d9d7cee\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 19 09:26:22.664497 containerd[1676]: time="2026-01-19T09:26:22.662738527Z" level=info msg="Container b5be1fbb76cdcc98922100c51411aa9a8ba951e485703a287561c86778aa361b: CDI devices from CRI Config.CDIDevices: []" Jan 19 09:26:22.682585 containerd[1676]: time="2026-01-19T09:26:22.682530130Z" level=info msg="CreateContainer within sandbox \"8c8d449c43434b8d8d4b81c17a31d9c86429595be836620c2fc2a01e2d9d7cee\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"b5be1fbb76cdcc98922100c51411aa9a8ba951e485703a287561c86778aa361b\"" Jan 19 09:26:22.683564 containerd[1676]: time="2026-01-19T09:26:22.683472907Z" level=info msg="StartContainer for \"b5be1fbb76cdcc98922100c51411aa9a8ba951e485703a287561c86778aa361b\"" Jan 19 09:26:22.685070 containerd[1676]: time="2026-01-19T09:26:22.684785327Z" level=info msg="connecting to shim b5be1fbb76cdcc98922100c51411aa9a8ba951e485703a287561c86778aa361b" address="unix:///run/containerd/s/595d62d79192058cd2ab87446e4025637da4cac425267a38771d7e8db85ee68c" protocol=ttrpc version=3 Jan 19 09:26:22.707668 systemd[1]: Started cri-containerd-b5be1fbb76cdcc98922100c51411aa9a8ba951e485703a287561c86778aa361b.scope - libcontainer container b5be1fbb76cdcc98922100c51411aa9a8ba951e485703a287561c86778aa361b. 
Jan 19 09:26:22.780406 kernel: kauditd_printk_skb: 34 callbacks suppressed Jan 19 09:26:22.780546 kernel: audit: type=1334 audit(1768814782.777:573): prog-id=169 op=LOAD Jan 19 09:26:22.777000 audit: BPF prog-id=169 op=LOAD Jan 19 09:26:22.777000 audit[3687]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=3441 pid=3687 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:22.785481 kernel: audit: type=1300 audit(1768814782.777:573): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=3441 pid=3687 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:22.792484 kernel: audit: type=1327 audit(1768814782.777:573): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235626531666262373663646363393839323231303063353134313161 Jan 19 09:26:22.777000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235626531666262373663646363393839323231303063353134313161 Jan 19 09:26:22.797978 kernel: audit: type=1334 audit(1768814782.777:574): prog-id=170 op=LOAD Jan 19 09:26:22.777000 audit: BPF prog-id=170 op=LOAD Jan 19 09:26:22.777000 audit[3687]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=3441 pid=3687 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:22.800843 kernel: audit: type=1300 audit(1768814782.777:574): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=3441 pid=3687 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:22.777000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235626531666262373663646363393839323231303063353134313161 Jan 19 09:26:22.812521 kernel: audit: type=1327 audit(1768814782.777:574): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235626531666262373663646363393839323231303063353134313161 Jan 19 09:26:22.777000 audit: BPF prog-id=170 op=UNLOAD Jan 19 09:26:22.816459 kernel: audit: type=1334 audit(1768814782.777:575): prog-id=170 op=UNLOAD Jan 19 09:26:22.777000 audit[3687]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3441 pid=3687 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:22.777000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235626531666262373663646363393839323231303063353134313161 Jan 19 09:26:22.826004 kernel: audit: type=1300 audit(1768814782.777:575): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3441 pid=3687 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:22.826096 kernel: audit: type=1327 audit(1768814782.777:575): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235626531666262373663646363393839323231303063353134313161 Jan 19 09:26:22.831877 kernel: audit: type=1334 audit(1768814782.777:576): prog-id=169 op=UNLOAD Jan 19 09:26:22.777000 audit: BPF prog-id=169 op=UNLOAD Jan 19 09:26:22.777000 audit[3687]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3441 pid=3687 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:22.777000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235626531666262373663646363393839323231303063353134313161 Jan 19 09:26:22.777000 audit: BPF prog-id=171 op=LOAD Jan 19 09:26:22.777000 audit[3687]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=3441 pid=3687 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:22.777000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235626531666262373663646363393839323231303063353134313161 Jan 19 09:26:22.840780 containerd[1676]: time="2026-01-19T09:26:22.840731248Z" level=info msg="StartContainer for \"b5be1fbb76cdcc98922100c51411aa9a8ba951e485703a287561c86778aa361b\" returns successfully" Jan 19 09:26:23.314381 containerd[1676]: time="2026-01-19T09:26:23.314333178Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 19 09:26:23.316645 systemd[1]: cri-containerd-b5be1fbb76cdcc98922100c51411aa9a8ba951e485703a287561c86778aa361b.scope: Deactivated successfully. Jan 19 09:26:23.316976 systemd[1]: cri-containerd-b5be1fbb76cdcc98922100c51411aa9a8ba951e485703a287561c86778aa361b.scope: Consumed 501ms CPU time, 197.3M memory peak, 171.3M written to disk. 
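The audit PROCTITLE records above encode the audited process's argv as a hex string with NUL bytes separating arguments. A small helper (illustrative only) makes them readable:

```python
def decode_proctitle(hex_proctitle: str) -> str:
    """Turn an audit proctitle= hex value into a space-separated command line."""
    return (bytes.fromhex(hex_proctitle)
            .replace(b"\x00", b" ")
            .decode("utf-8", errors="replace"))

# Applied to the proctitle= values logged for pid 3687 above, this yields a
# runc invocation along the lines of:
#   runc --root /run/containerd/runc/k8s.io \
#        --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/<container-id>...
# (the audit subsystem truncates long command lines, so the id is cut short).
```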
Jan 19 09:26:23.318701 containerd[1676]: time="2026-01-19T09:26:23.318635479Z" level=info msg="received container exit event container_id:\"b5be1fbb76cdcc98922100c51411aa9a8ba951e485703a287561c86778aa361b\" id:\"b5be1fbb76cdcc98922100c51411aa9a8ba951e485703a287561c86778aa361b\" pid:3701 exited_at:{seconds:1768814783 nanos:318375667}" Jan 19 09:26:23.319000 audit: BPF prog-id=171 op=UNLOAD Jan 19 09:26:23.330736 kubelet[2926]: I0119 09:26:23.330709 2926 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Jan 19 09:26:23.351305 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b5be1fbb76cdcc98922100c51411aa9a8ba951e485703a287561c86778aa361b-rootfs.mount: Deactivated successfully. Jan 19 09:26:23.426850 systemd[1]: Created slice kubepods-burstable-poda95d2224_19eb_41be_ac7f_5930c584a95b.slice - libcontainer container kubepods-burstable-poda95d2224_19eb_41be_ac7f_5930c584a95b.slice. Jan 19 09:26:23.440973 systemd[1]: Created slice kubepods-burstable-podc3af37da_419f_42f6_9360_491eb370d45d.slice - libcontainer container kubepods-burstable-podc3af37da_419f_42f6_9360_491eb370d45d.slice. Jan 19 09:26:23.449465 systemd[1]: Created slice kubepods-besteffort-podc1023597_3f97_435e_a787_6c49b8480f62.slice - libcontainer container kubepods-besteffort-podc1023597_3f97_435e_a787_6c49b8480f62.slice. Jan 19 09:26:23.459841 systemd[1]: Created slice kubepods-besteffort-pod4ce7f23f_45be_4728_a485_c7ae56af8a5c.slice - libcontainer container kubepods-besteffort-pod4ce7f23f_45be_4728_a485_c7ae56af8a5c.slice. Jan 19 09:26:23.462337 kubelet[2926]: I0119 09:26:23.461326 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/37c84e8a-a62d-48e6-91f5-791aabccbba8-calico-apiserver-certs\") pod \"calico-apiserver-558fc744bc-cf5rv\" (UID: \"37c84e8a-a62d-48e6-91f5-791aabccbba8\") " pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" Jan 19 09:26:23.464573 kubelet[2926]: I0119 09:26:23.464343 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf-whisker-backend-key-pair\") pod \"whisker-776fcd4f77-vm42h\" (UID: \"2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf\") " pod="calico-system/whisker-776fcd4f77-vm42h" Jan 19 09:26:23.465697 kubelet[2926]: I0119 09:26:23.465666 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf-whisker-ca-bundle\") pod \"whisker-776fcd4f77-vm42h\" (UID: \"2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf\") " pod="calico-system/whisker-776fcd4f77-vm42h" Jan 19 09:26:23.465953 kubelet[2926]: I0119 09:26:23.465708 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1023597-3f97-435e-a787-6c49b8480f62-tigera-ca-bundle\") pod \"calico-kube-controllers-6bcfd67b5c-6xwgh\" (UID: \"c1023597-3f97-435e-a787-6c49b8480f62\") " pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" Jan 19 09:26:23.465953 kubelet[2926]: I0119 09:26:23.465734 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x72r9\" (UniqueName: \"kubernetes.io/projected/c1023597-3f97-435e-a787-6c49b8480f62-kube-api-access-x72r9\") pod 
\"calico-kube-controllers-6bcfd67b5c-6xwgh\" (UID: \"c1023597-3f97-435e-a787-6c49b8480f62\") " pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" Jan 19 09:26:23.465953 kubelet[2926]: I0119 09:26:23.465757 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pm82\" (UniqueName: \"kubernetes.io/projected/911306cd-bd01-4287-9129-4bbfd31bbd16-kube-api-access-2pm82\") pod \"calico-apiserver-67d5ff6dcb-bnwh7\" (UID: \"911306cd-bd01-4287-9129-4bbfd31bbd16\") " pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" Jan 19 09:26:23.465953 kubelet[2926]: I0119 09:26:23.465774 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4703ae1-6e1e-4185-acf2-7e68ee8729a3-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-l9tzs\" (UID: \"d4703ae1-6e1e-4185-acf2-7e68ee8729a3\") " pod="calico-system/goldmane-7c778bb748-l9tzs" Jan 19 09:26:23.466331 kubelet[2926]: I0119 09:26:23.465986 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl4pd\" (UniqueName: \"kubernetes.io/projected/37c84e8a-a62d-48e6-91f5-791aabccbba8-kube-api-access-vl4pd\") pod \"calico-apiserver-558fc744bc-cf5rv\" (UID: \"37c84e8a-a62d-48e6-91f5-791aabccbba8\") " pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" Jan 19 09:26:23.466331 kubelet[2926]: I0119 09:26:23.466014 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wn7s\" (UniqueName: \"kubernetes.io/projected/2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf-kube-api-access-5wn7s\") pod \"whisker-776fcd4f77-vm42h\" (UID: \"2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf\") " pod="calico-system/whisker-776fcd4f77-vm42h" Jan 19 09:26:23.466331 kubelet[2926]: I0119 09:26:23.466047 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4703ae1-6e1e-4185-acf2-7e68ee8729a3-config\") pod \"goldmane-7c778bb748-l9tzs\" (UID: \"d4703ae1-6e1e-4185-acf2-7e68ee8729a3\") " pod="calico-system/goldmane-7c778bb748-l9tzs" Jan 19 09:26:23.466331 kubelet[2926]: I0119 09:26:23.466064 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/d4703ae1-6e1e-4185-acf2-7e68ee8729a3-goldmane-key-pair\") pod \"goldmane-7c778bb748-l9tzs\" (UID: \"d4703ae1-6e1e-4185-acf2-7e68ee8729a3\") " pod="calico-system/goldmane-7c778bb748-l9tzs" Jan 19 09:26:23.466331 kubelet[2926]: I0119 09:26:23.466084 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a95d2224-19eb-41be-ac7f-5930c584a95b-config-volume\") pod \"coredns-66bc5c9577-bvxq9\" (UID: \"a95d2224-19eb-41be-ac7f-5930c584a95b\") " pod="kube-system/coredns-66bc5c9577-bvxq9" Jan 19 09:26:23.466801 kubelet[2926]: I0119 09:26:23.466104 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bh49\" (UniqueName: \"kubernetes.io/projected/4ce7f23f-45be-4728-a485-c7ae56af8a5c-kube-api-access-4bh49\") pod \"calico-apiserver-67d5ff6dcb-sg9sb\" (UID: \"4ce7f23f-45be-4728-a485-c7ae56af8a5c\") " pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" Jan 19 09:26:23.466801 kubelet[2926]: I0119 09:26:23.466255 2926 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxn9h\" (UniqueName: \"kubernetes.io/projected/d4703ae1-6e1e-4185-acf2-7e68ee8729a3-kube-api-access-rxn9h\") pod \"goldmane-7c778bb748-l9tzs\" (UID: \"d4703ae1-6e1e-4185-acf2-7e68ee8729a3\") " pod="calico-system/goldmane-7c778bb748-l9tzs" Jan 19 09:26:23.466801 kubelet[2926]: I0119 09:26:23.466278 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4ce7f23f-45be-4728-a485-c7ae56af8a5c-calico-apiserver-certs\") pod \"calico-apiserver-67d5ff6dcb-sg9sb\" (UID: \"4ce7f23f-45be-4728-a485-c7ae56af8a5c\") " pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" Jan 19 09:26:23.466801 kubelet[2926]: I0119 09:26:23.466319 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/911306cd-bd01-4287-9129-4bbfd31bbd16-calico-apiserver-certs\") pod \"calico-apiserver-67d5ff6dcb-bnwh7\" (UID: \"911306cd-bd01-4287-9129-4bbfd31bbd16\") " pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" Jan 19 09:26:23.466801 kubelet[2926]: I0119 09:26:23.466342 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n82h\" (UniqueName: \"kubernetes.io/projected/a95d2224-19eb-41be-ac7f-5930c584a95b-kube-api-access-6n82h\") pod \"coredns-66bc5c9577-bvxq9\" (UID: \"a95d2224-19eb-41be-ac7f-5930c584a95b\") " pod="kube-system/coredns-66bc5c9577-bvxq9" Jan 19 09:26:23.467144 kubelet[2926]: I0119 09:26:23.466365 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9kq6\" (UniqueName: \"kubernetes.io/projected/c3af37da-419f-42f6-9360-491eb370d45d-kube-api-access-z9kq6\") pod \"coredns-66bc5c9577-fr6vw\" (UID: \"c3af37da-419f-42f6-9360-491eb370d45d\") " pod="kube-system/coredns-66bc5c9577-fr6vw" Jan 19 09:26:23.469414 kubelet[2926]: I0119 09:26:23.469295 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3af37da-419f-42f6-9360-491eb370d45d-config-volume\") pod \"coredns-66bc5c9577-fr6vw\" (UID: \"c3af37da-419f-42f6-9360-491eb370d45d\") " pod="kube-system/coredns-66bc5c9577-fr6vw" Jan 19 09:26:23.477350 systemd[1]: Created slice kubepods-besteffort-pod911306cd_bd01_4287_9129_4bbfd31bbd16.slice - libcontainer container kubepods-besteffort-pod911306cd_bd01_4287_9129_4bbfd31bbd16.slice. Jan 19 09:26:23.490423 systemd[1]: Created slice kubepods-besteffort-pod37c84e8a_a62d_48e6_91f5_791aabccbba8.slice - libcontainer container kubepods-besteffort-pod37c84e8a_a62d_48e6_91f5_791aabccbba8.slice. Jan 19 09:26:23.499147 systemd[1]: Created slice kubepods-besteffort-podd4703ae1_6e1e_4185_acf2_7e68ee8729a3.slice - libcontainer container kubepods-besteffort-podd4703ae1_6e1e_4185_acf2_7e68ee8729a3.slice. Jan 19 09:26:23.507302 systemd[1]: Created slice kubepods-besteffort-pod2cbb6ebb_e115_4e96_a5c8_4fb0c2c039cf.slice - libcontainer container kubepods-besteffort-pod2cbb6ebb_e115_4e96_a5c8_4fb0c2c039cf.slice. 
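The "Created slice kubepods-..." entries above show how the kubelet's systemd cgroup driver names a pod's slice from its QoS class and UID, with dashes in the UID rewritten as underscores. A sketch of that mapping, inferred from the slice names in this log rather than from the kubelet source:

```python
def kubepods_slice(qos_class: str, pod_uid: str) -> str:
    # e.g. ("besteffort", "2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf")
    #  -> "kubepods-besteffort-pod2cbb6ebb_e115_4e96_a5c8_4fb0c2c039cf.slice"
    # Matches the burstable/besteffort slices logged above; other QoS classes
    # may differ (guaranteed pods omit the QoS segment in the kubelet proper).
    return f"kubepods-{qos_class}-pod{pod_uid.replace('-', '_')}.slice"
```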
Jan 19 09:26:23.737524 containerd[1676]: time="2026-01-19T09:26:23.737384092Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-bvxq9,Uid:a95d2224-19eb-41be-ac7f-5930c584a95b,Namespace:kube-system,Attempt:0,}" Jan 19 09:26:23.751152 containerd[1676]: time="2026-01-19T09:26:23.750923491Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-fr6vw,Uid:c3af37da-419f-42f6-9360-491eb370d45d,Namespace:kube-system,Attempt:0,}" Jan 19 09:26:23.760509 containerd[1676]: time="2026-01-19T09:26:23.760444690Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6bcfd67b5c-6xwgh,Uid:c1023597-3f97-435e-a787-6c49b8480f62,Namespace:calico-system,Attempt:0,}" Jan 19 09:26:23.776021 containerd[1676]: time="2026-01-19T09:26:23.775353308Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-67d5ff6dcb-sg9sb,Uid:4ce7f23f-45be-4728-a485-c7ae56af8a5c,Namespace:calico-apiserver,Attempt:0,}" Jan 19 09:26:23.785098 containerd[1676]: time="2026-01-19T09:26:23.785063699Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-67d5ff6dcb-bnwh7,Uid:911306cd-bd01-4287-9129-4bbfd31bbd16,Namespace:calico-apiserver,Attempt:0,}" Jan 19 09:26:23.801642 containerd[1676]: time="2026-01-19T09:26:23.801557059Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-558fc744bc-cf5rv,Uid:37c84e8a-a62d-48e6-91f5-791aabccbba8,Namespace:calico-apiserver,Attempt:0,}" Jan 19 09:26:23.809194 containerd[1676]: time="2026-01-19T09:26:23.809152054Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-l9tzs,Uid:d4703ae1-6e1e-4185-acf2-7e68ee8729a3,Namespace:calico-system,Attempt:0,}" Jan 19 09:26:23.814621 containerd[1676]: time="2026-01-19T09:26:23.813386735Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-776fcd4f77-vm42h,Uid:2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf,Namespace:calico-system,Attempt:0,}" Jan 19 09:26:23.917574 containerd[1676]: time="2026-01-19T09:26:23.917525902Z" level=error msg="Failed to destroy network for sandbox \"6b1e438649210a6a27ac17a31d3e0a9460fa08a368d370a7080974b202628605\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:23.924139 containerd[1676]: time="2026-01-19T09:26:23.923616966Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-bvxq9,Uid:a95d2224-19eb-41be-ac7f-5930c584a95b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b1e438649210a6a27ac17a31d3e0a9460fa08a368d370a7080974b202628605\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:23.925073 kubelet[2926]: E0119 09:26:23.924955 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b1e438649210a6a27ac17a31d3e0a9460fa08a368d370a7080974b202628605\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:23.925073 kubelet[2926]: E0119 09:26:23.925028 2926 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"6b1e438649210a6a27ac17a31d3e0a9460fa08a368d370a7080974b202628605\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-bvxq9" Jan 19 09:26:23.925073 kubelet[2926]: E0119 09:26:23.925050 2926 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b1e438649210a6a27ac17a31d3e0a9460fa08a368d370a7080974b202628605\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-bvxq9" Jan 19 09:26:23.925278 kubelet[2926]: E0119 09:26:23.925110 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-bvxq9_kube-system(a95d2224-19eb-41be-ac7f-5930c584a95b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-bvxq9_kube-system(a95d2224-19eb-41be-ac7f-5930c584a95b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6b1e438649210a6a27ac17a31d3e0a9460fa08a368d370a7080974b202628605\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-bvxq9" podUID="a95d2224-19eb-41be-ac7f-5930c584a95b" Jan 19 09:26:23.982584 containerd[1676]: time="2026-01-19T09:26:23.982531124Z" level=error msg="Failed to destroy network for sandbox \"35342d455786147498cb8a826b9312b08267187257c9c22e758d99be6817e47a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:23.989202 containerd[1676]: time="2026-01-19T09:26:23.988973401Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-67d5ff6dcb-bnwh7,Uid:911306cd-bd01-4287-9129-4bbfd31bbd16,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"35342d455786147498cb8a826b9312b08267187257c9c22e758d99be6817e47a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:23.989912 kubelet[2926]: E0119 09:26:23.989775 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35342d455786147498cb8a826b9312b08267187257c9c22e758d99be6817e47a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:23.989912 kubelet[2926]: E0119 09:26:23.989888 2926 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35342d455786147498cb8a826b9312b08267187257c9c22e758d99be6817e47a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" Jan 19 
09:26:23.989912 kubelet[2926]: E0119 09:26:23.989912 2926 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35342d455786147498cb8a826b9312b08267187257c9c22e758d99be6817e47a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" Jan 19 09:26:23.991363 kubelet[2926]: E0119 09:26:23.991325 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-67d5ff6dcb-bnwh7_calico-apiserver(911306cd-bd01-4287-9129-4bbfd31bbd16)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-67d5ff6dcb-bnwh7_calico-apiserver(911306cd-bd01-4287-9129-4bbfd31bbd16)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"35342d455786147498cb8a826b9312b08267187257c9c22e758d99be6817e47a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" Jan 19 09:26:24.006017 containerd[1676]: time="2026-01-19T09:26:24.005965382Z" level=error msg="Failed to destroy network for sandbox \"831f6e9c475bcebd01a1a2c62d37a2868f24e49b643442c50b99da1397800777\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.013766 containerd[1676]: time="2026-01-19T09:26:24.013726396Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-67d5ff6dcb-sg9sb,Uid:4ce7f23f-45be-4728-a485-c7ae56af8a5c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"831f6e9c475bcebd01a1a2c62d37a2868f24e49b643442c50b99da1397800777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.014062 containerd[1676]: time="2026-01-19T09:26:24.014016558Z" level=error msg="Failed to destroy network for sandbox \"9c51d7d98678cc3d66d38055db4b43e15dc5a047d297d0cf5c2f4a5d420217c2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.014651 kubelet[2926]: E0119 09:26:24.014549 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"831f6e9c475bcebd01a1a2c62d37a2868f24e49b643442c50b99da1397800777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.014651 kubelet[2926]: E0119 09:26:24.014629 2926 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"831f6e9c475bcebd01a1a2c62d37a2868f24e49b643442c50b99da1397800777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" Jan 19 09:26:24.014724 kubelet[2926]: E0119 09:26:24.014656 2926 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"831f6e9c475bcebd01a1a2c62d37a2868f24e49b643442c50b99da1397800777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" Jan 19 09:26:24.014724 kubelet[2926]: E0119 09:26:24.014701 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-67d5ff6dcb-sg9sb_calico-apiserver(4ce7f23f-45be-4728-a485-c7ae56af8a5c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-67d5ff6dcb-sg9sb_calico-apiserver(4ce7f23f-45be-4728-a485-c7ae56af8a5c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"831f6e9c475bcebd01a1a2c62d37a2868f24e49b643442c50b99da1397800777\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:26:24.016485 containerd[1676]: time="2026-01-19T09:26:24.016463804Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6bcfd67b5c-6xwgh,Uid:c1023597-3f97-435e-a787-6c49b8480f62,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c51d7d98678cc3d66d38055db4b43e15dc5a047d297d0cf5c2f4a5d420217c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.017674 kubelet[2926]: E0119 09:26:24.017625 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c51d7d98678cc3d66d38055db4b43e15dc5a047d297d0cf5c2f4a5d420217c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.017803 kubelet[2926]: E0119 09:26:24.017718 2926 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c51d7d98678cc3d66d38055db4b43e15dc5a047d297d0cf5c2f4a5d420217c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" Jan 19 09:26:24.017803 kubelet[2926]: E0119 09:26:24.017743 2926 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c51d7d98678cc3d66d38055db4b43e15dc5a047d297d0cf5c2f4a5d420217c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" Jan 19 09:26:24.017918 kubelet[2926]: E0119 
09:26:24.017821 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6bcfd67b5c-6xwgh_calico-system(c1023597-3f97-435e-a787-6c49b8480f62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6bcfd67b5c-6xwgh_calico-system(c1023597-3f97-435e-a787-6c49b8480f62)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9c51d7d98678cc3d66d38055db4b43e15dc5a047d297d0cf5c2f4a5d420217c2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:26:24.030968 containerd[1676]: time="2026-01-19T09:26:24.030901334Z" level=error msg="Failed to destroy network for sandbox \"cd4a07aaf415d91dfb3023fa267fab4c81e9eeebd86b9a07221509dd072a50bf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.032755 containerd[1676]: time="2026-01-19T09:26:24.032732157Z" level=error msg="Failed to destroy network for sandbox \"4cbe58c8d63bd4a4fcff1c52dbd20677cdbe9e5f22e2c5acfd6242b99ade61e7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.035365 containerd[1676]: time="2026-01-19T09:26:24.035337104Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-776fcd4f77-vm42h,Uid:2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd4a07aaf415d91dfb3023fa267fab4c81e9eeebd86b9a07221509dd072a50bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.036455 kubelet[2926]: E0119 09:26:24.036373 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd4a07aaf415d91dfb3023fa267fab4c81e9eeebd86b9a07221509dd072a50bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.036455 kubelet[2926]: E0119 09:26:24.036447 2926 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd4a07aaf415d91dfb3023fa267fab4c81e9eeebd86b9a07221509dd072a50bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-776fcd4f77-vm42h" Jan 19 09:26:24.036643 kubelet[2926]: E0119 09:26:24.036470 2926 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd4a07aaf415d91dfb3023fa267fab4c81e9eeebd86b9a07221509dd072a50bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/whisker-776fcd4f77-vm42h" Jan 19 09:26:24.036643 kubelet[2926]: E0119 09:26:24.036530 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-776fcd4f77-vm42h_calico-system(2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-776fcd4f77-vm42h_calico-system(2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cd4a07aaf415d91dfb3023fa267fab4c81e9eeebd86b9a07221509dd072a50bf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-776fcd4f77-vm42h" podUID="2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf" Jan 19 09:26:24.039137 containerd[1676]: time="2026-01-19T09:26:24.039114360Z" level=error msg="Failed to destroy network for sandbox \"bc76fc2fb586dda67b9f4175f9e5edf4ece5fa1cc0dc603da5708610b3b1a161\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.040217 containerd[1676]: time="2026-01-19T09:26:24.040177678Z" level=error msg="Failed to destroy network for sandbox \"8fea01aa966f61dd126c23c9cfd5d4e1bd812182e061c8102460ed7c5d235edb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.041371 containerd[1676]: time="2026-01-19T09:26:24.041341366Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-fr6vw,Uid:c3af37da-419f-42f6-9360-491eb370d45d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cbe58c8d63bd4a4fcff1c52dbd20677cdbe9e5f22e2c5acfd6242b99ade61e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.041672 kubelet[2926]: E0119 09:26:24.041623 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cbe58c8d63bd4a4fcff1c52dbd20677cdbe9e5f22e2c5acfd6242b99ade61e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.041858 kubelet[2926]: E0119 09:26:24.041762 2926 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cbe58c8d63bd4a4fcff1c52dbd20677cdbe9e5f22e2c5acfd6242b99ade61e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-fr6vw" Jan 19 09:26:24.041858 kubelet[2926]: E0119 09:26:24.041799 2926 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cbe58c8d63bd4a4fcff1c52dbd20677cdbe9e5f22e2c5acfd6242b99ade61e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-fr6vw" Jan 19 09:26:24.041969 kubelet[2926]: E0119 09:26:24.041950 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-fr6vw_kube-system(c3af37da-419f-42f6-9360-491eb370d45d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-fr6vw_kube-system(c3af37da-419f-42f6-9360-491eb370d45d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4cbe58c8d63bd4a4fcff1c52dbd20677cdbe9e5f22e2c5acfd6242b99ade61e7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-fr6vw" podUID="c3af37da-419f-42f6-9360-491eb370d45d" Jan 19 09:26:24.042955 containerd[1676]: time="2026-01-19T09:26:24.042920676Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-558fc744bc-cf5rv,Uid:37c84e8a-a62d-48e6-91f5-791aabccbba8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc76fc2fb586dda67b9f4175f9e5edf4ece5fa1cc0dc603da5708610b3b1a161\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.043429 kubelet[2926]: E0119 09:26:24.043217 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc76fc2fb586dda67b9f4175f9e5edf4ece5fa1cc0dc603da5708610b3b1a161\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.043429 kubelet[2926]: E0119 09:26:24.043259 2926 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc76fc2fb586dda67b9f4175f9e5edf4ece5fa1cc0dc603da5708610b3b1a161\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" Jan 19 09:26:24.043429 kubelet[2926]: E0119 09:26:24.043271 2926 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc76fc2fb586dda67b9f4175f9e5edf4ece5fa1cc0dc603da5708610b3b1a161\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" Jan 19 09:26:24.043572 kubelet[2926]: E0119 09:26:24.043535 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-558fc744bc-cf5rv_calico-apiserver(37c84e8a-a62d-48e6-91f5-791aabccbba8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-558fc744bc-cf5rv_calico-apiserver(37c84e8a-a62d-48e6-91f5-791aabccbba8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bc76fc2fb586dda67b9f4175f9e5edf4ece5fa1cc0dc603da5708610b3b1a161\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:26:24.044790 containerd[1676]: time="2026-01-19T09:26:24.044709969Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-l9tzs,Uid:d4703ae1-6e1e-4185-acf2-7e68ee8729a3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fea01aa966f61dd126c23c9cfd5d4e1bd812182e061c8102460ed7c5d235edb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.045132 kubelet[2926]: E0119 09:26:24.045108 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fea01aa966f61dd126c23c9cfd5d4e1bd812182e061c8102460ed7c5d235edb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.045260 kubelet[2926]: E0119 09:26:24.045170 2926 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fea01aa966f61dd126c23c9cfd5d4e1bd812182e061c8102460ed7c5d235edb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-l9tzs" Jan 19 09:26:24.045260 kubelet[2926]: E0119 09:26:24.045196 2926 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8fea01aa966f61dd126c23c9cfd5d4e1bd812182e061c8102460ed7c5d235edb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-l9tzs" Jan 19 09:26:24.045314 kubelet[2926]: E0119 09:26:24.045259 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-l9tzs_calico-system(d4703ae1-6e1e-4185-acf2-7e68ee8729a3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-l9tzs_calico-system(d4703ae1-6e1e-4185-acf2-7e68ee8729a3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8fea01aa966f61dd126c23c9cfd5d4e1bd812182e061c8102460ed7c5d235edb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:26:24.231650 systemd[1]: Created slice kubepods-besteffort-podcf97e1c1_8e02_43e7_988e_adabcb45ce04.slice - libcontainer container kubepods-besteffort-podcf97e1c1_8e02_43e7_988e_adabcb45ce04.slice. 
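Every CreatePodSandbox failure logged above (and the csi-node-driver one that follows) reports the same underlying condition: the Calico CNI plugin stats /var/lib/calico/nodename and the file does not exist yet because the calico/node container has not started. A minimal, illustrative sketch of the same check, runnable on the node (the path is the one named in the errors; nothing here is taken from the plugin's source):

```go
package main

import (
	"fmt"
	"os"
)

// Path the Calico CNI plugin checks on every ADD/DEL; calico/node writes the
// node name here once it is running with /var/lib/calico/ mounted.
const nodenamePath = "/var/lib/calico/nodename"

func main() {
	data, err := os.ReadFile(nodenamePath)
	if err != nil {
		// Same condition the sandbox errors above report.
		fmt.Printf("stat %s: %v\n", nodenamePath, err)
		os.Exit(1)
	}
	fmt.Printf("calico/node is up; node name: %s\n", data)
}
```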
Jan 19 09:26:24.237052 containerd[1676]: time="2026-01-19T09:26:24.237008629Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-km4zx,Uid:cf97e1c1-8e02-43e7-988e-adabcb45ce04,Namespace:calico-system,Attempt:0,}" Jan 19 09:26:24.301693 containerd[1676]: time="2026-01-19T09:26:24.301446791Z" level=error msg="Failed to destroy network for sandbox \"0d4e0c8ef6ebdf74d66b080aa218b6e86cbf20f5712774629dd73246abc23350\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.305082 containerd[1676]: time="2026-01-19T09:26:24.305046445Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-km4zx,Uid:cf97e1c1-8e02-43e7-988e-adabcb45ce04,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d4e0c8ef6ebdf74d66b080aa218b6e86cbf20f5712774629dd73246abc23350\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.305851 kubelet[2926]: E0119 09:26:24.305809 2926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d4e0c8ef6ebdf74d66b080aa218b6e86cbf20f5712774629dd73246abc23350\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 19 09:26:24.305930 kubelet[2926]: E0119 09:26:24.305867 2926 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d4e0c8ef6ebdf74d66b080aa218b6e86cbf20f5712774629dd73246abc23350\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-km4zx" Jan 19 09:26:24.305930 kubelet[2926]: E0119 09:26:24.305887 2926 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d4e0c8ef6ebdf74d66b080aa218b6e86cbf20f5712774629dd73246abc23350\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-km4zx" Jan 19 09:26:24.305992 kubelet[2926]: E0119 09:26:24.305943 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-km4zx_calico-system(cf97e1c1-8e02-43e7-988e-adabcb45ce04)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-km4zx_calico-system(cf97e1c1-8e02-43e7-988e-adabcb45ce04)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0d4e0c8ef6ebdf74d66b080aa218b6e86cbf20f5712774629dd73246abc23350\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:26:24.360793 containerd[1676]: time="2026-01-19T09:26:24.360695488Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 19 09:26:24.665220 systemd[1]: run-netns-cni\x2d39846249\x2dc4cb\x2d757c\x2db8c8\x2d5ce6810e3eb9.mount: Deactivated successfully. Jan 19 09:26:24.665342 systemd[1]: run-netns-cni\x2d68352a72\x2ddae2\x2d2436\x2da017\x2d8f7c7f88348a.mount: Deactivated successfully. Jan 19 09:26:24.665422 systemd[1]: run-netns-cni\x2d032500de\x2d6b57\x2dae07\x2d8171\x2d5bade0cd9e1d.mount: Deactivated successfully. Jan 19 09:26:24.665506 systemd[1]: run-netns-cni\x2de5a655bd\x2d7387\x2dea38\x2d64ee\x2d8af136734928.mount: Deactivated successfully. Jan 19 09:26:27.252757 kubelet[2926]: I0119 09:26:27.252721 2926 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 19 09:26:27.290000 audit[3976]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3976 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:27.290000 audit[3976]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffdc9ceb420 a2=0 a3=7ffdc9ceb40c items=0 ppid=3032 pid=3976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:27.290000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:27.296000 audit[3976]: NETFILTER_CFG table=nat:116 family=2 entries=19 op=nft_register_chain pid=3976 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:27.296000 audit[3976]: SYSCALL arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7ffdc9ceb420 a2=0 a3=7ffdc9ceb40c items=0 ppid=3032 pid=3976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:27.296000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:32.020324 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2916244491.mount: Deactivated successfully. 
Jan 19 09:26:32.044659 containerd[1676]: time="2026-01-19T09:26:32.044600619Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:26:32.045631 containerd[1676]: time="2026-01-19T09:26:32.045557654Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Jan 19 09:26:32.046478 containerd[1676]: time="2026-01-19T09:26:32.046456197Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:26:32.048860 containerd[1676]: time="2026-01-19T09:26:32.048416386Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 19 09:26:32.048860 containerd[1676]: time="2026-01-19T09:26:32.048764317Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 7.688023859s" Jan 19 09:26:32.048860 containerd[1676]: time="2026-01-19T09:26:32.048785317Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Jan 19 09:26:32.066053 containerd[1676]: time="2026-01-19T09:26:32.065995004Z" level=info msg="CreateContainer within sandbox \"8c8d449c43434b8d8d4b81c17a31d9c86429595be836620c2fc2a01e2d9d7cee\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 19 09:26:32.076373 containerd[1676]: time="2026-01-19T09:26:32.076339700Z" level=info msg="Container aa3638bc354f69380b080f8a4e6faa5811e08b051c129c2a1ed90deeb8d594c9: CDI devices from CRI Config.CDIDevices: []" Jan 19 09:26:32.089221 containerd[1676]: time="2026-01-19T09:26:32.089178127Z" level=info msg="CreateContainer within sandbox \"8c8d449c43434b8d8d4b81c17a31d9c86429595be836620c2fc2a01e2d9d7cee\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"aa3638bc354f69380b080f8a4e6faa5811e08b051c129c2a1ed90deeb8d594c9\"" Jan 19 09:26:32.089694 containerd[1676]: time="2026-01-19T09:26:32.089676689Z" level=info msg="StartContainer for \"aa3638bc354f69380b080f8a4e6faa5811e08b051c129c2a1ed90deeb8d594c9\"" Jan 19 09:26:32.091201 containerd[1676]: time="2026-01-19T09:26:32.091178146Z" level=info msg="connecting to shim aa3638bc354f69380b080f8a4e6faa5811e08b051c129c2a1ed90deeb8d594c9" address="unix:///run/containerd/s/595d62d79192058cd2ab87446e4025637da4cac425267a38771d7e8db85ee68c" protocol=ttrpc version=3 Jan 19 09:26:32.131542 systemd[1]: Started cri-containerd-aa3638bc354f69380b080f8a4e6faa5811e08b051c129c2a1ed90deeb8d594c9.scope - libcontainer container aa3638bc354f69380b080f8a4e6faa5811e08b051c129c2a1ed90deeb8d594c9. 
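As a rough cross-check of the pull above: containerd reports 156880025 bytes read and a total pull time of 7.688023859s for ghcr.io/flatcar/calico/node:v3.30.4, which works out to roughly 20 MB/s (simple arithmetic on the logged values, nothing more):

```go
package main

import "fmt"

// Average throughput of the calico/node image pull, using the two values
// containerd logged above ("bytes read=156880025", "in 7.688023859s").
func main() {
	const bytesRead = 156880025.0
	const seconds = 7.688023859

	fmt.Printf("average pull throughput: %.1f MB/s\n", bytesRead/seconds/1e6)
}
```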
Jan 19 09:26:32.184693 kernel: kauditd_printk_skb: 12 callbacks suppressed Jan 19 09:26:32.184808 kernel: audit: type=1334 audit(1768814792.177:581): prog-id=172 op=LOAD Jan 19 09:26:32.177000 audit: BPF prog-id=172 op=LOAD Jan 19 09:26:32.177000 audit[3986]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000186488 a2=98 a3=0 items=0 ppid=3441 pid=3986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:32.194539 kernel: audit: type=1300 audit(1768814792.177:581): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000186488 a2=98 a3=0 items=0 ppid=3441 pid=3986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:32.177000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161333633386263333534663639333830623038306638613465366661 Jan 19 09:26:32.205469 kernel: audit: type=1327 audit(1768814792.177:581): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161333633386263333534663639333830623038306638613465366661 Jan 19 09:26:32.205514 kernel: audit: type=1334 audit(1768814792.183:582): prog-id=173 op=LOAD Jan 19 09:26:32.183000 audit: BPF prog-id=173 op=LOAD Jan 19 09:26:32.208453 kernel: audit: type=1300 audit(1768814792.183:582): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000186218 a2=98 a3=0 items=0 ppid=3441 pid=3986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:32.183000 audit[3986]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000186218 a2=98 a3=0 items=0 ppid=3441 pid=3986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:32.183000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161333633386263333534663639333830623038306638613465366661 Jan 19 09:26:32.214002 kernel: audit: type=1327 audit(1768814792.183:582): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161333633386263333534663639333830623038306638613465366661 Jan 19 09:26:32.183000 audit: BPF prog-id=173 op=UNLOAD Jan 19 09:26:32.218889 kernel: audit: type=1334 audit(1768814792.183:583): prog-id=173 op=UNLOAD Jan 19 09:26:32.183000 audit[3986]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3441 pid=3986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:32.221427 kernel: audit: type=1300 
audit(1768814792.183:583): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3441 pid=3986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:32.183000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161333633386263333534663639333830623038306638613465366661 Jan 19 09:26:32.226942 kernel: audit: type=1327 audit(1768814792.183:583): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161333633386263333534663639333830623038306638613465366661 Jan 19 09:26:32.183000 audit: BPF prog-id=172 op=UNLOAD Jan 19 09:26:32.183000 audit[3986]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3441 pid=3986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:32.183000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161333633386263333534663639333830623038306638613465366661 Jan 19 09:26:32.183000 audit: BPF prog-id=174 op=LOAD Jan 19 09:26:32.183000 audit[3986]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001866e8 a2=98 a3=0 items=0 ppid=3441 pid=3986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:32.183000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161333633386263333534663639333830623038306638613465366661 Jan 19 09:26:32.235448 kernel: audit: type=1334 audit(1768814792.183:584): prog-id=172 op=UNLOAD Jan 19 09:26:32.253792 containerd[1676]: time="2026-01-19T09:26:32.253749466Z" level=info msg="StartContainer for \"aa3638bc354f69380b080f8a4e6faa5811e08b051c129c2a1ed90deeb8d594c9\" returns successfully" Jan 19 09:26:32.387145 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 19 09:26:32.387253 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
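The pod_startup_latency_tracker entry that follows reports two durations for calico-node-2vxc2, and both can be reconciled from the timestamps it carries: the E2E figure is watchObservedRunningTime minus podCreationTimestamp, and the SLO figure comes out as the E2E figure minus the image-pull window. Reading the SLO duration as excluding pull time is my interpretation of the tracker; the numbers below are copied from that entry:

```go
package main

import "fmt"

// Reconstructing podStartE2EDuration and podStartSLOduration for
// calico-node-2vxc2 from the timestamps in the tracker entry below.
func main() {
	const (
		created   = 12.000000000 // podCreationTimestamp 09:26:12 (seconds past 09:26)
		observed  = 32.431455644 // watchObservedRunningTime 09:26:32.431455644
		pullStart = 22.903732719 // firstStartedPulling, monotonic m=+22.903732719
		pullEnd   = 41.946439392 // lastFinishedPulling, monotonic m=+41.946439392
	)

	e2e := observed - created          // ~20.431455644s, matches podStartE2EDuration
	slo := e2e - (pullEnd - pullStart) // ~1.388748971s, matches podStartSLOduration
	fmt.Printf("E2E=%.9fs  SLO=%.9fs\n", e2e, slo)
}
```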
Jan 19 09:26:32.431676 kubelet[2926]: I0119 09:26:32.431468 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-2vxc2" podStartSLOduration=1.388748971 podStartE2EDuration="20.431455644s" podCreationTimestamp="2026-01-19 09:26:12 +0000 UTC" firstStartedPulling="2026-01-19 09:26:13.006913968 +0000 UTC m=+22.903732719" lastFinishedPulling="2026-01-19 09:26:32.049620631 +0000 UTC m=+41.946439392" observedRunningTime="2026-01-19 09:26:32.429090553 +0000 UTC m=+42.325909314" watchObservedRunningTime="2026-01-19 09:26:32.431455644 +0000 UTC m=+42.328274405" Jan 19 09:26:32.641739 kubelet[2926]: I0119 09:26:32.641631 2926 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf-whisker-ca-bundle\") pod \"2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf\" (UID: \"2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf\") " Jan 19 09:26:32.641739 kubelet[2926]: I0119 09:26:32.641674 2926 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf-whisker-backend-key-pair\") pod \"2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf\" (UID: \"2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf\") " Jan 19 09:26:32.641739 kubelet[2926]: I0119 09:26:32.641687 2926 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wn7s\" (UniqueName: \"kubernetes.io/projected/2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf-kube-api-access-5wn7s\") pod \"2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf\" (UID: \"2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf\") " Jan 19 09:26:32.642285 kubelet[2926]: I0119 09:26:32.642263 2926 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf" (UID: "2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 19 09:26:32.647139 kubelet[2926]: I0119 09:26:32.647117 2926 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf" (UID: "2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 19 09:26:32.647746 kubelet[2926]: I0119 09:26:32.647720 2926 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf-kube-api-access-5wn7s" (OuterVolumeSpecName: "kube-api-access-5wn7s") pod "2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf" (UID: "2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf"). InnerVolumeSpecName "kube-api-access-5wn7s". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 19 09:26:32.742668 kubelet[2926]: I0119 09:26:32.742619 2926 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf-whisker-ca-bundle\") on node \"ci-4580-0-0-p-637e605ed7\" DevicePath \"\"" Jan 19 09:26:32.742847 kubelet[2926]: I0119 09:26:32.742721 2926 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf-whisker-backend-key-pair\") on node \"ci-4580-0-0-p-637e605ed7\" DevicePath \"\"" Jan 19 09:26:32.742847 kubelet[2926]: I0119 09:26:32.742731 2926 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5wn7s\" (UniqueName: \"kubernetes.io/projected/2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf-kube-api-access-5wn7s\") on node \"ci-4580-0-0-p-637e605ed7\" DevicePath \"\"" Jan 19 09:26:33.022207 systemd[1]: var-lib-kubelet-pods-2cbb6ebb\x2de115\x2d4e96\x2da5c8\x2d4fb0c2c039cf-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d5wn7s.mount: Deactivated successfully. Jan 19 09:26:33.023247 systemd[1]: var-lib-kubelet-pods-2cbb6ebb\x2de115\x2d4e96\x2da5c8\x2d4fb0c2c039cf-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 19 09:26:33.412071 systemd[1]: Removed slice kubepods-besteffort-pod2cbb6ebb_e115_4e96_a5c8_4fb0c2c039cf.slice - libcontainer container kubepods-besteffort-pod2cbb6ebb_e115_4e96_a5c8_4fb0c2c039cf.slice. Jan 19 09:26:33.603174 systemd[1]: Created slice kubepods-besteffort-pod61f2280f_1a00_4605_a568_ed8af0fdcb06.slice - libcontainer container kubepods-besteffort-pod61f2280f_1a00_4605_a568_ed8af0fdcb06.slice. Jan 19 09:26:33.650315 kubelet[2926]: I0119 09:26:33.650193 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgvcb\" (UniqueName: \"kubernetes.io/projected/61f2280f-1a00-4605-a568-ed8af0fdcb06-kube-api-access-qgvcb\") pod \"whisker-5cb97b98cc-snlsk\" (UID: \"61f2280f-1a00-4605-a568-ed8af0fdcb06\") " pod="calico-system/whisker-5cb97b98cc-snlsk" Jan 19 09:26:33.651169 kubelet[2926]: I0119 09:26:33.650442 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61f2280f-1a00-4605-a568-ed8af0fdcb06-whisker-ca-bundle\") pod \"whisker-5cb97b98cc-snlsk\" (UID: \"61f2280f-1a00-4605-a568-ed8af0fdcb06\") " pod="calico-system/whisker-5cb97b98cc-snlsk" Jan 19 09:26:33.651169 kubelet[2926]: I0119 09:26:33.650495 2926 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/61f2280f-1a00-4605-a568-ed8af0fdcb06-whisker-backend-key-pair\") pod \"whisker-5cb97b98cc-snlsk\" (UID: \"61f2280f-1a00-4605-a568-ed8af0fdcb06\") " pod="calico-system/whisker-5cb97b98cc-snlsk" Jan 19 09:26:33.914180 containerd[1676]: time="2026-01-19T09:26:33.914116771Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5cb97b98cc-snlsk,Uid:61f2280f-1a00-4605-a568-ed8af0fdcb06,Namespace:calico-system,Attempt:0,}" Jan 19 09:26:34.118227 systemd-networkd[1554]: cali5c42309fb78: Link UP Jan 19 09:26:34.119279 systemd-networkd[1554]: cali5c42309fb78: Gained carrier Jan 19 09:26:34.145201 containerd[1676]: 2026-01-19 09:26:33.964 [INFO][4183] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 19 
09:26:34.145201 containerd[1676]: 2026-01-19 09:26:34.025 [INFO][4183] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--637e605ed7-k8s-whisker--5cb97b98cc--snlsk-eth0 whisker-5cb97b98cc- calico-system 61f2280f-1a00-4605-a568-ed8af0fdcb06 940 0 2026-01-19 09:26:33 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5cb97b98cc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4580-0-0-p-637e605ed7 whisker-5cb97b98cc-snlsk eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali5c42309fb78 [] [] }} ContainerID="d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" Namespace="calico-system" Pod="whisker-5cb97b98cc-snlsk" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-whisker--5cb97b98cc--snlsk-" Jan 19 09:26:34.145201 containerd[1676]: 2026-01-19 09:26:34.025 [INFO][4183] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" Namespace="calico-system" Pod="whisker-5cb97b98cc-snlsk" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-whisker--5cb97b98cc--snlsk-eth0" Jan 19 09:26:34.145201 containerd[1676]: 2026-01-19 09:26:34.052 [INFO][4195] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" HandleID="k8s-pod-network.d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" Workload="ci--4580--0--0--p--637e605ed7-k8s-whisker--5cb97b98cc--snlsk-eth0" Jan 19 09:26:34.146301 containerd[1676]: 2026-01-19 09:26:34.052 [INFO][4195] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" HandleID="k8s-pod-network.d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" Workload="ci--4580--0--0--p--637e605ed7-k8s-whisker--5cb97b98cc--snlsk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00048e0e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-637e605ed7", "pod":"whisker-5cb97b98cc-snlsk", "timestamp":"2026-01-19 09:26:34.052464423 +0000 UTC"}, Hostname:"ci-4580-0-0-p-637e605ed7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 19 09:26:34.146301 containerd[1676]: 2026-01-19 09:26:34.052 [INFO][4195] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 19 09:26:34.146301 containerd[1676]: 2026-01-19 09:26:34.052 [INFO][4195] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 19 09:26:34.146301 containerd[1676]: 2026-01-19 09:26:34.052 [INFO][4195] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-637e605ed7' Jan 19 09:26:34.146301 containerd[1676]: 2026-01-19 09:26:34.066 [INFO][4195] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:34.146301 containerd[1676]: 2026-01-19 09:26:34.071 [INFO][4195] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:34.146301 containerd[1676]: 2026-01-19 09:26:34.074 [INFO][4195] ipam/ipam.go 511: Trying affinity for 192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:34.146301 containerd[1676]: 2026-01-19 09:26:34.076 [INFO][4195] ipam/ipam.go 158: Attempting to load block cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:34.146301 containerd[1676]: 2026-01-19 09:26:34.079 [INFO][4195] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:34.147608 containerd[1676]: 2026-01-19 09:26:34.080 [INFO][4195] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.30.0/26 handle="k8s-pod-network.d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:34.147608 containerd[1676]: 2026-01-19 09:26:34.084 [INFO][4195] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70 Jan 19 09:26:34.147608 containerd[1676]: 2026-01-19 09:26:34.091 [INFO][4195] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.30.0/26 handle="k8s-pod-network.d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:34.147608 containerd[1676]: 2026-01-19 09:26:34.097 [INFO][4195] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.30.1/26] block=192.168.30.0/26 handle="k8s-pod-network.d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:34.147608 containerd[1676]: 2026-01-19 09:26:34.097 [INFO][4195] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.30.1/26] handle="k8s-pod-network.d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:34.147608 containerd[1676]: 2026-01-19 09:26:34.097 [INFO][4195] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 19 09:26:34.147608 containerd[1676]: 2026-01-19 09:26:34.097 [INFO][4195] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.30.1/26] IPv6=[] ContainerID="d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" HandleID="k8s-pod-network.d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" Workload="ci--4580--0--0--p--637e605ed7-k8s-whisker--5cb97b98cc--snlsk-eth0" Jan 19 09:26:34.147743 containerd[1676]: 2026-01-19 09:26:34.102 [INFO][4183] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" Namespace="calico-system" Pod="whisker-5cb97b98cc-snlsk" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-whisker--5cb97b98cc--snlsk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-whisker--5cb97b98cc--snlsk-eth0", GenerateName:"whisker-5cb97b98cc-", Namespace:"calico-system", SelfLink:"", UID:"61f2280f-1a00-4605-a568-ed8af0fdcb06", ResourceVersion:"940", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 26, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5cb97b98cc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"", Pod:"whisker-5cb97b98cc-snlsk", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.30.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali5c42309fb78", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:34.147743 containerd[1676]: 2026-01-19 09:26:34.102 [INFO][4183] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.1/32] ContainerID="d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" Namespace="calico-system" Pod="whisker-5cb97b98cc-snlsk" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-whisker--5cb97b98cc--snlsk-eth0" Jan 19 09:26:34.147810 containerd[1676]: 2026-01-19 09:26:34.102 [INFO][4183] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5c42309fb78 ContainerID="d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" Namespace="calico-system" Pod="whisker-5cb97b98cc-snlsk" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-whisker--5cb97b98cc--snlsk-eth0" Jan 19 09:26:34.147810 containerd[1676]: 2026-01-19 09:26:34.118 [INFO][4183] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" Namespace="calico-system" Pod="whisker-5cb97b98cc-snlsk" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-whisker--5cb97b98cc--snlsk-eth0" Jan 19 09:26:34.147844 containerd[1676]: 2026-01-19 09:26:34.118 [INFO][4183] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" Namespace="calico-system" 
Pod="whisker-5cb97b98cc-snlsk" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-whisker--5cb97b98cc--snlsk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-whisker--5cb97b98cc--snlsk-eth0", GenerateName:"whisker-5cb97b98cc-", Namespace:"calico-system", SelfLink:"", UID:"61f2280f-1a00-4605-a568-ed8af0fdcb06", ResourceVersion:"940", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 26, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5cb97b98cc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70", Pod:"whisker-5cb97b98cc-snlsk", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.30.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali5c42309fb78", MAC:"7a:60:65:e2:c9:11", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:34.147881 containerd[1676]: 2026-01-19 09:26:34.142 [INFO][4183] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" Namespace="calico-system" Pod="whisker-5cb97b98cc-snlsk" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-whisker--5cb97b98cc--snlsk-eth0" Jan 19 09:26:34.204595 containerd[1676]: time="2026-01-19T09:26:34.204458910Z" level=info msg="connecting to shim d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70" address="unix:///run/containerd/s/043f30cca792bbf7299293d64dd8fcf3ae929cfd32a58fe5a4015f38765de8b2" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:26:34.227861 kubelet[2926]: I0119 09:26:34.227633 2926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf" path="/var/lib/kubelet/pods/2cbb6ebb-e115-4e96-a5c8-4fb0c2c039cf/volumes" Jan 19 09:26:34.229000 audit: BPF prog-id=175 op=LOAD Jan 19 09:26:34.229000 audit[4272]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe4b6310f0 a2=98 a3=1fffffffffffffff items=0 ppid=4115 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.229000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 19 09:26:34.229000 audit: BPF prog-id=175 op=UNLOAD Jan 19 09:26:34.229000 audit[4272]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffe4b6310c0 a3=0 items=0 ppid=4115 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.229000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 19 09:26:34.229000 audit: BPF prog-id=176 op=LOAD Jan 19 09:26:34.229000 audit[4272]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe4b630fd0 a2=94 a3=3 items=0 ppid=4115 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.229000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 19 09:26:34.229000 audit: BPF prog-id=176 op=UNLOAD Jan 19 09:26:34.229000 audit[4272]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffe4b630fd0 a2=94 a3=3 items=0 ppid=4115 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.229000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 19 09:26:34.229000 audit: BPF prog-id=177 op=LOAD Jan 19 09:26:34.229000 audit[4272]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe4b631010 a2=94 a3=7ffe4b6311f0 items=0 ppid=4115 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.229000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 19 09:26:34.229000 audit: BPF prog-id=177 op=UNLOAD Jan 19 09:26:34.229000 audit[4272]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffe4b631010 a2=94 a3=7ffe4b6311f0 items=0 ppid=4115 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.229000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 19 09:26:34.231000 audit: BPF prog-id=178 op=LOAD Jan 19 09:26:34.231000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffcd2de1f80 a2=98 a3=3 items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 
09:26:34.231000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.231000 audit: BPF prog-id=178 op=UNLOAD Jan 19 09:26:34.231000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffcd2de1f50 a3=0 items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.231000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.233000 audit: BPF prog-id=179 op=LOAD Jan 19 09:26:34.233000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcd2de1d70 a2=94 a3=54428f items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.233000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.233000 audit: BPF prog-id=179 op=UNLOAD Jan 19 09:26:34.233000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcd2de1d70 a2=94 a3=54428f items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.233000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.233000 audit: BPF prog-id=180 op=LOAD Jan 19 09:26:34.233000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcd2de1da0 a2=94 a3=2 items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.233000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.233000 audit: BPF prog-id=180 op=UNLOAD Jan 19 09:26:34.233000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcd2de1da0 a2=0 a3=2 items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.233000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.237994 systemd[1]: Started cri-containerd-d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70.scope - libcontainer container d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70. 
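Before the whisker sandbox above was started, the Calico IPAM lines show 192.168.30.1 being claimed from the block 192.168.30.0/26 that is affine to ci-4580-0-0-p-637e605ed7. A quick, purely illustrative sanity check of that allocation:

```go
package main

import (
	"fmt"
	"net/netip"
)

// The IPAM log above claims 192.168.30.1/26 out of block 192.168.30.0/26.
func main() {
	block := netip.MustParsePrefix("192.168.30.0/26")
	assigned := netip.MustParseAddr("192.168.30.1")

	fmt.Println("address in block:", block.Contains(assigned)) // true
	fmt.Println("addresses per /26:", 1<<(32-block.Bits()))    // 64
}
```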
Jan 19 09:26:34.258000 audit: BPF prog-id=181 op=LOAD Jan 19 09:26:34.259000 audit: BPF prog-id=182 op=LOAD Jan 19 09:26:34.259000 audit[4255]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=4240 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.259000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439393365623437643830633332653330626236343934653530383863 Jan 19 09:26:34.259000 audit: BPF prog-id=182 op=UNLOAD Jan 19 09:26:34.259000 audit[4255]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4240 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.259000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439393365623437643830633332653330626236343934653530383863 Jan 19 09:26:34.260000 audit: BPF prog-id=183 op=LOAD Jan 19 09:26:34.260000 audit[4255]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=4240 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.260000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439393365623437643830633332653330626236343934653530383863 Jan 19 09:26:34.260000 audit: BPF prog-id=184 op=LOAD Jan 19 09:26:34.260000 audit[4255]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=4240 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.260000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439393365623437643830633332653330626236343934653530383863 Jan 19 09:26:34.260000 audit: BPF prog-id=184 op=UNLOAD Jan 19 09:26:34.260000 audit[4255]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4240 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.260000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439393365623437643830633332653330626236343934653530383863 Jan 19 09:26:34.260000 audit: BPF prog-id=183 op=UNLOAD Jan 19 09:26:34.260000 audit[4255]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4240 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.260000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439393365623437643830633332653330626236343934653530383863 Jan 19 09:26:34.261000 audit: BPF prog-id=185 op=LOAD Jan 19 09:26:34.261000 audit[4255]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=4240 pid=4255 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.261000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6439393365623437643830633332653330626236343934653530383863 Jan 19 09:26:34.307873 containerd[1676]: time="2026-01-19T09:26:34.307808163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5cb97b98cc-snlsk,Uid:61f2280f-1a00-4605-a568-ed8af0fdcb06,Namespace:calico-system,Attempt:0,} returns sandbox id \"d993eb47d80c32e30bb6494e5088cd1b8c175adfbe5cb1b0593149c31ac8ed70\"" Jan 19 09:26:34.310662 containerd[1676]: time="2026-01-19T09:26:34.310549785Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 19 09:26:34.398000 audit: BPF prog-id=186 op=LOAD Jan 19 09:26:34.398000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffcd2de1c60 a2=94 a3=1 items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.398000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.398000 audit: BPF prog-id=186 op=UNLOAD Jan 19 09:26:34.398000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffcd2de1c60 a2=94 a3=1 items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.398000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.409000 audit: BPF prog-id=187 op=LOAD Jan 19 09:26:34.409000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcd2de1c50 a2=94 a3=4 items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.409000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.409000 audit: BPF prog-id=187 op=UNLOAD Jan 19 09:26:34.409000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffcd2de1c50 a2=0 a3=4 items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 19 09:26:34.409000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.409000 audit: BPF prog-id=188 op=LOAD Jan 19 09:26:34.409000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffcd2de1ab0 a2=94 a3=5 items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.409000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.409000 audit: BPF prog-id=188 op=UNLOAD Jan 19 09:26:34.409000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffcd2de1ab0 a2=0 a3=5 items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.409000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.409000 audit: BPF prog-id=189 op=LOAD Jan 19 09:26:34.409000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcd2de1cd0 a2=94 a3=6 items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.409000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.409000 audit: BPF prog-id=189 op=UNLOAD Jan 19 09:26:34.409000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffcd2de1cd0 a2=0 a3=6 items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.409000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.409000 audit: BPF prog-id=190 op=LOAD Jan 19 09:26:34.409000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffcd2de1480 a2=94 a3=88 items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.409000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.410000 audit: BPF prog-id=191 op=LOAD Jan 19 09:26:34.410000 audit[4273]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffcd2de1300 a2=94 a3=2 items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.410000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.410000 audit: BPF prog-id=191 op=UNLOAD Jan 19 09:26:34.410000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffcd2de1330 a2=0 a3=7ffcd2de1430 items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.410000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.410000 audit: BPF prog-id=190 op=UNLOAD Jan 19 
09:26:34.410000 audit[4273]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=2344fd10 a2=0 a3=55ed4d035295496e items=0 ppid=4115 pid=4273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.410000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 19 09:26:34.420000 audit: BPF prog-id=192 op=LOAD Jan 19 09:26:34.420000 audit[4291]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffde7023700 a2=98 a3=1999999999999999 items=0 ppid=4115 pid=4291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.420000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 19 09:26:34.421000 audit: BPF prog-id=192 op=UNLOAD Jan 19 09:26:34.421000 audit[4291]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffde70236d0 a3=0 items=0 ppid=4115 pid=4291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.421000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 19 09:26:34.421000 audit: BPF prog-id=193 op=LOAD Jan 19 09:26:34.421000 audit[4291]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffde70235e0 a2=94 a3=ffff items=0 ppid=4115 pid=4291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.421000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 19 09:26:34.421000 audit: BPF prog-id=193 op=UNLOAD Jan 19 09:26:34.421000 audit[4291]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffde70235e0 a2=94 a3=ffff items=0 ppid=4115 pid=4291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.421000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 19 09:26:34.421000 audit: BPF prog-id=194 op=LOAD Jan 19 09:26:34.421000 audit[4291]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffde7023620 a2=94 a3=7ffde7023800 items=0 ppid=4115 pid=4291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.421000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 19 09:26:34.421000 audit: BPF prog-id=194 op=UNLOAD Jan 19 09:26:34.421000 audit[4291]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffde7023620 a2=94 a3=7ffde7023800 items=0 ppid=4115 pid=4291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.421000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 19 09:26:34.513531 systemd-networkd[1554]: vxlan.calico: Link UP Jan 19 09:26:34.513633 systemd-networkd[1554]: vxlan.calico: Gained carrier Jan 19 09:26:34.541000 audit: BPF prog-id=195 op=LOAD Jan 19 09:26:34.541000 audit[4320]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff78f732f0 a2=98 a3=0 items=0 ppid=4115 pid=4320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.541000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 19 09:26:34.541000 audit: BPF prog-id=195 op=UNLOAD Jan 19 09:26:34.541000 audit[4320]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fff78f732c0 a3=0 items=0 ppid=4115 pid=4320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.541000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 19 09:26:34.541000 audit: BPF prog-id=196 op=LOAD Jan 19 09:26:34.541000 audit[4320]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff78f73100 a2=94 a3=54428f items=0 ppid=4115 pid=4320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.541000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 19 09:26:34.541000 audit: BPF prog-id=196 op=UNLOAD Jan 19 09:26:34.541000 audit[4320]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff78f73100 a2=94 a3=54428f items=0 ppid=4115 pid=4320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 
09:26:34.541000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 19 09:26:34.541000 audit: BPF prog-id=197 op=LOAD Jan 19 09:26:34.541000 audit[4320]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff78f73130 a2=94 a3=2 items=0 ppid=4115 pid=4320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.541000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 19 09:26:34.541000 audit: BPF prog-id=197 op=UNLOAD Jan 19 09:26:34.541000 audit[4320]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff78f73130 a2=0 a3=2 items=0 ppid=4115 pid=4320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.541000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 19 09:26:34.541000 audit: BPF prog-id=198 op=LOAD Jan 19 09:26:34.541000 audit[4320]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff78f72ee0 a2=94 a3=4 items=0 ppid=4115 pid=4320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.541000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 19 09:26:34.541000 audit: BPF prog-id=198 op=UNLOAD Jan 19 09:26:34.541000 audit[4320]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff78f72ee0 a2=94 a3=4 items=0 ppid=4115 pid=4320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.541000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 19 09:26:34.542000 audit: BPF prog-id=199 op=LOAD Jan 19 09:26:34.542000 audit[4320]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff78f72fe0 a2=94 a3=7fff78f73160 items=0 ppid=4115 pid=4320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.542000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 19 09:26:34.542000 audit: BPF prog-id=199 op=UNLOAD Jan 19 
09:26:34.542000 audit[4320]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff78f72fe0 a2=0 a3=7fff78f73160 items=0 ppid=4115 pid=4320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.542000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 19 09:26:34.544000 audit: BPF prog-id=200 op=LOAD Jan 19 09:26:34.544000 audit[4320]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff78f72710 a2=94 a3=2 items=0 ppid=4115 pid=4320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.544000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 19 09:26:34.544000 audit: BPF prog-id=200 op=UNLOAD Jan 19 09:26:34.544000 audit[4320]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff78f72710 a2=0 a3=2 items=0 ppid=4115 pid=4320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.544000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 19 09:26:34.544000 audit: BPF prog-id=201 op=LOAD Jan 19 09:26:34.544000 audit[4320]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff78f72810 a2=94 a3=30 items=0 ppid=4115 pid=4320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.544000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 19 09:26:34.554000 audit: BPF prog-id=202 op=LOAD Jan 19 09:26:34.554000 audit[4322]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff0b8c9870 a2=98 a3=0 items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.554000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.554000 audit: BPF prog-id=202 op=UNLOAD Jan 19 09:26:34.554000 audit[4322]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fff0b8c9840 a3=0 items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.554000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.554000 audit: BPF prog-id=203 op=LOAD Jan 19 09:26:34.554000 audit[4322]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff0b8c9660 a2=94 a3=54428f items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.554000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.554000 audit: BPF prog-id=203 op=UNLOAD Jan 19 09:26:34.554000 audit[4322]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff0b8c9660 a2=94 a3=54428f items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.554000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.554000 audit: BPF prog-id=204 op=LOAD Jan 19 09:26:34.554000 audit[4322]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff0b8c9690 a2=94 a3=2 items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.554000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.554000 audit: BPF prog-id=204 op=UNLOAD Jan 19 09:26:34.554000 audit[4322]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff0b8c9690 a2=0 a3=2 items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.554000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.703000 audit: BPF prog-id=205 op=LOAD Jan 19 09:26:34.703000 audit[4322]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff0b8c9550 a2=94 a3=1 items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.703000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.703000 audit: BPF prog-id=205 op=UNLOAD Jan 19 09:26:34.703000 audit[4322]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff0b8c9550 a2=94 a3=1 items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.703000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.713000 audit: BPF prog-id=206 op=LOAD Jan 19 09:26:34.713000 audit[4322]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff0b8c9540 a2=94 a3=4 items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.713000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.713000 audit: BPF prog-id=206 op=UNLOAD Jan 19 09:26:34.713000 audit[4322]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7fff0b8c9540 a2=0 a3=4 items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.713000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.714000 audit: BPF prog-id=207 op=LOAD Jan 19 09:26:34.714000 audit[4322]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff0b8c93a0 a2=94 a3=5 items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.714000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.714000 audit: BPF prog-id=207 op=UNLOAD Jan 19 09:26:34.714000 audit[4322]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff0b8c93a0 a2=0 a3=5 items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.714000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.714000 audit: BPF prog-id=208 op=LOAD Jan 19 09:26:34.714000 audit[4322]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff0b8c95c0 a2=94 a3=6 items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.714000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.714000 audit: BPF prog-id=208 op=UNLOAD Jan 19 09:26:34.714000 audit[4322]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7fff0b8c95c0 
a2=0 a3=6 items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.714000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.714000 audit: BPF prog-id=209 op=LOAD Jan 19 09:26:34.714000 audit[4322]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff0b8c8d70 a2=94 a3=88 items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.714000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.714000 audit: BPF prog-id=210 op=LOAD Jan 19 09:26:34.714000 audit[4322]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7fff0b8c8bf0 a2=94 a3=2 items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.714000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.714000 audit: BPF prog-id=210 op=UNLOAD Jan 19 09:26:34.714000 audit[4322]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7fff0b8c8c20 a2=0 a3=7fff0b8c8d20 items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.714000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.715000 audit: BPF prog-id=209 op=UNLOAD Jan 19 09:26:34.715000 audit[4322]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=28240d10 a2=0 a3=e5cfa7bc82fdf42e items=0 ppid=4115 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.715000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 19 09:26:34.727000 audit: BPF prog-id=201 op=UNLOAD Jan 19 09:26:34.727000 audit[4115]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c000c28640 a2=0 a3=0 items=0 ppid=4097 pid=4115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.727000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 19 09:26:34.739162 containerd[1676]: time="2026-01-19T09:26:34.738943496Z" level=info msg="fetch failed after status: 404 Not Found" 
host=ghcr.io Jan 19 09:26:34.742938 containerd[1676]: time="2026-01-19T09:26:34.742888483Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 19 09:26:34.743027 containerd[1676]: time="2026-01-19T09:26:34.742989293Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:34.743243 kubelet[2926]: E0119 09:26:34.743204 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 19 09:26:34.744066 kubelet[2926]: E0119 09:26:34.744021 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 19 09:26:34.744163 kubelet[2926]: E0119 09:26:34.744137 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-5cb97b98cc-snlsk_calico-system(61f2280f-1a00-4605-a568-ed8af0fdcb06): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:34.746687 containerd[1676]: time="2026-01-19T09:26:34.746644338Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 19 09:26:34.801000 audit[4346]: NETFILTER_CFG table=nat:117 family=2 entries=15 op=nft_register_chain pid=4346 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 19 09:26:34.801000 audit[4346]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffc7eb906e0 a2=0 a3=7ffc7eb906cc items=0 ppid=4115 pid=4346 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.801000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 19 09:26:34.801000 audit[4347]: NETFILTER_CFG table=mangle:118 family=2 entries=16 op=nft_register_chain pid=4347 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 19 09:26:34.801000 audit[4347]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7fff4ef22880 a2=0 a3=564efc619000 items=0 ppid=4115 pid=4347 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.801000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 19 09:26:34.816000 audit[4345]: NETFILTER_CFG table=raw:119 family=2 entries=21 op=nft_register_chain pid=4345 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 19 09:26:34.816000 audit[4345]: SYSCALL 
arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7fff25f2dc80 a2=0 a3=7fff25f2dc6c items=0 ppid=4115 pid=4345 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.816000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 19 09:26:34.817000 audit[4348]: NETFILTER_CFG table=filter:120 family=2 entries=94 op=nft_register_chain pid=4348 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 19 09:26:34.817000 audit[4348]: SYSCALL arch=c000003e syscall=46 success=yes exit=53116 a0=3 a1=7ffdbe8d0f90 a2=0 a3=559c01066000 items=0 ppid=4115 pid=4348 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:34.817000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 19 09:26:35.182492 containerd[1676]: time="2026-01-19T09:26:35.182246963Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:35.184921 containerd[1676]: time="2026-01-19T09:26:35.184737662Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 19 09:26:35.184921 containerd[1676]: time="2026-01-19T09:26:35.184867213Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:35.185308 kubelet[2926]: E0119 09:26:35.185219 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 19 09:26:35.185379 kubelet[2926]: E0119 09:26:35.185299 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 19 09:26:35.185539 kubelet[2926]: E0119 09:26:35.185461 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-5cb97b98cc-snlsk_calico-system(61f2280f-1a00-4605-a568-ed8af0fdcb06): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:35.185636 kubelet[2926]: E0119 09:26:35.185585 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cb97b98cc-snlsk" podUID="61f2280f-1a00-4605-a568-ed8af0fdcb06" Jan 19 09:26:35.229297 containerd[1676]: time="2026-01-19T09:26:35.229062731Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-558fc744bc-cf5rv,Uid:37c84e8a-a62d-48e6-91f5-791aabccbba8,Namespace:calico-apiserver,Attempt:0,}" Jan 19 09:26:35.231807 containerd[1676]: time="2026-01-19T09:26:35.231692471Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6bcfd67b5c-6xwgh,Uid:c1023597-3f97-435e-a787-6c49b8480f62,Namespace:calico-system,Attempt:0,}" Jan 19 09:26:35.232696 containerd[1676]: time="2026-01-19T09:26:35.232639025Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-67d5ff6dcb-sg9sb,Uid:4ce7f23f-45be-4728-a485-c7ae56af8a5c,Namespace:calico-apiserver,Attempt:0,}" Jan 19 09:26:35.405471 kubelet[2926]: E0119 09:26:35.405354 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cb97b98cc-snlsk" podUID="61f2280f-1a00-4605-a568-ed8af0fdcb06" Jan 19 09:26:35.453944 systemd-networkd[1554]: calic06ddeebd03: Link UP Jan 19 09:26:35.461443 systemd-networkd[1554]: calic06ddeebd03: Gained carrier Jan 19 09:26:35.480000 audit[4417]: NETFILTER_CFG table=filter:121 family=2 entries=20 op=nft_register_rule pid=4417 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:35.480000 audit[4417]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffeae3f5660 a2=0 a3=7ffeae3f564c items=0 ppid=3032 pid=4417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.480000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:35.485000 audit[4417]: NETFILTER_CFG table=nat:122 family=2 entries=14 op=nft_register_rule pid=4417 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:35.485000 audit[4417]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffeae3f5660 a2=0 a3=0 items=0 ppid=3032 pid=4417 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.485000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:35.497382 containerd[1676]: 2026-01-19 09:26:35.308 [INFO][4362] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--558fc744bc--cf5rv-eth0 calico-apiserver-558fc744bc- calico-apiserver 37c84e8a-a62d-48e6-91f5-791aabccbba8 864 0 2026-01-19 09:26:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:558fc744bc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4580-0-0-p-637e605ed7 calico-apiserver-558fc744bc-cf5rv eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic06ddeebd03 [] [] }} ContainerID="817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" Namespace="calico-apiserver" Pod="calico-apiserver-558fc744bc-cf5rv" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--558fc744bc--cf5rv-" Jan 19 09:26:35.497382 containerd[1676]: 2026-01-19 09:26:35.308 [INFO][4362] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" Namespace="calico-apiserver" Pod="calico-apiserver-558fc744bc-cf5rv" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--558fc744bc--cf5rv-eth0" Jan 19 09:26:35.497382 containerd[1676]: 2026-01-19 09:26:35.346 [INFO][4392] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" HandleID="k8s-pod-network.817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" Workload="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--558fc744bc--cf5rv-eth0" Jan 19 09:26:35.497755 containerd[1676]: 2026-01-19 09:26:35.346 [INFO][4392] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" HandleID="k8s-pod-network.817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" Workload="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--558fc744bc--cf5rv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4fe0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4580-0-0-p-637e605ed7", "pod":"calico-apiserver-558fc744bc-cf5rv", "timestamp":"2026-01-19 09:26:35.346419827 +0000 UTC"}, Hostname:"ci-4580-0-0-p-637e605ed7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 19 09:26:35.497755 containerd[1676]: 2026-01-19 09:26:35.347 [INFO][4392] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 19 09:26:35.497755 containerd[1676]: 2026-01-19 09:26:35.347 [INFO][4392] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 19 09:26:35.497755 containerd[1676]: 2026-01-19 09:26:35.347 [INFO][4392] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-637e605ed7' Jan 19 09:26:35.497755 containerd[1676]: 2026-01-19 09:26:35.357 [INFO][4392] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.497755 containerd[1676]: 2026-01-19 09:26:35.366 [INFO][4392] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.497755 containerd[1676]: 2026-01-19 09:26:35.373 [INFO][4392] ipam/ipam.go 511: Trying affinity for 192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.497755 containerd[1676]: 2026-01-19 09:26:35.376 [INFO][4392] ipam/ipam.go 158: Attempting to load block cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.497755 containerd[1676]: 2026-01-19 09:26:35.380 [INFO][4392] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.497976 containerd[1676]: 2026-01-19 09:26:35.380 [INFO][4392] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.30.0/26 handle="k8s-pod-network.817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.497976 containerd[1676]: 2026-01-19 09:26:35.381 [INFO][4392] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449 Jan 19 09:26:35.497976 containerd[1676]: 2026-01-19 09:26:35.388 [INFO][4392] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.30.0/26 handle="k8s-pod-network.817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.497976 containerd[1676]: 2026-01-19 09:26:35.399 [INFO][4392] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.30.2/26] block=192.168.30.0/26 handle="k8s-pod-network.817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.497976 containerd[1676]: 2026-01-19 09:26:35.399 [INFO][4392] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.30.2/26] handle="k8s-pod-network.817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.497976 containerd[1676]: 2026-01-19 09:26:35.399 [INFO][4392] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 19 09:26:35.497976 containerd[1676]: 2026-01-19 09:26:35.400 [INFO][4392] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.30.2/26] IPv6=[] ContainerID="817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" HandleID="k8s-pod-network.817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" Workload="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--558fc744bc--cf5rv-eth0" Jan 19 09:26:35.498138 containerd[1676]: 2026-01-19 09:26:35.419 [INFO][4362] cni-plugin/k8s.go 418: Populated endpoint ContainerID="817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" Namespace="calico-apiserver" Pod="calico-apiserver-558fc744bc-cf5rv" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--558fc744bc--cf5rv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--558fc744bc--cf5rv-eth0", GenerateName:"calico-apiserver-558fc744bc-", Namespace:"calico-apiserver", SelfLink:"", UID:"37c84e8a-a62d-48e6-91f5-791aabccbba8", ResourceVersion:"864", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 26, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"558fc744bc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"", Pod:"calico-apiserver-558fc744bc-cf5rv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.30.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic06ddeebd03", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:35.498204 containerd[1676]: 2026-01-19 09:26:35.422 [INFO][4362] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.2/32] ContainerID="817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" Namespace="calico-apiserver" Pod="calico-apiserver-558fc744bc-cf5rv" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--558fc744bc--cf5rv-eth0" Jan 19 09:26:35.498204 containerd[1676]: 2026-01-19 09:26:35.422 [INFO][4362] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic06ddeebd03 ContainerID="817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" Namespace="calico-apiserver" Pod="calico-apiserver-558fc744bc-cf5rv" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--558fc744bc--cf5rv-eth0" Jan 19 09:26:35.498204 containerd[1676]: 2026-01-19 09:26:35.475 [INFO][4362] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" Namespace="calico-apiserver" Pod="calico-apiserver-558fc744bc-cf5rv" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--558fc744bc--cf5rv-eth0" Jan 19 09:26:35.498363 containerd[1676]: 2026-01-19 09:26:35.476 
[INFO][4362] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" Namespace="calico-apiserver" Pod="calico-apiserver-558fc744bc-cf5rv" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--558fc744bc--cf5rv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--558fc744bc--cf5rv-eth0", GenerateName:"calico-apiserver-558fc744bc-", Namespace:"calico-apiserver", SelfLink:"", UID:"37c84e8a-a62d-48e6-91f5-791aabccbba8", ResourceVersion:"864", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 26, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"558fc744bc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449", Pod:"calico-apiserver-558fc744bc-cf5rv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.30.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic06ddeebd03", MAC:"f6:84:c2:e5:da:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:35.498461 containerd[1676]: 2026-01-19 09:26:35.493 [INFO][4362] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" Namespace="calico-apiserver" Pod="calico-apiserver-558fc744bc-cf5rv" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--558fc744bc--cf5rv-eth0" Jan 19 09:26:35.522942 systemd-networkd[1554]: cali5c42309fb78: Gained IPv6LL Jan 19 09:26:35.542116 systemd-networkd[1554]: cali448c9e7d4cc: Link UP Jan 19 09:26:35.543071 systemd-networkd[1554]: cali448c9e7d4cc: Gained carrier Jan 19 09:26:35.569106 containerd[1676]: time="2026-01-19T09:26:35.569054712Z" level=info msg="connecting to shim 817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449" address="unix:///run/containerd/s/2791ecc123ebb8634b412eec5c07e2de646a7c83917f56a3ab49537f0f762950" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:26:35.575206 containerd[1676]: 2026-01-19 09:26:35.354 [INFO][4373] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--637e605ed7-k8s-calico--kube--controllers--6bcfd67b5c--6xwgh-eth0 calico-kube-controllers-6bcfd67b5c- calico-system c1023597-3f97-435e-a787-6c49b8480f62 859 0 2026-01-19 09:26:12 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6bcfd67b5c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s 
ci-4580-0-0-p-637e605ed7 calico-kube-controllers-6bcfd67b5c-6xwgh eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali448c9e7d4cc [] [] }} ContainerID="5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" Namespace="calico-system" Pod="calico-kube-controllers-6bcfd67b5c-6xwgh" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--kube--controllers--6bcfd67b5c--6xwgh-" Jan 19 09:26:35.575206 containerd[1676]: 2026-01-19 09:26:35.354 [INFO][4373] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" Namespace="calico-system" Pod="calico-kube-controllers-6bcfd67b5c-6xwgh" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--kube--controllers--6bcfd67b5c--6xwgh-eth0" Jan 19 09:26:35.575206 containerd[1676]: 2026-01-19 09:26:35.456 [INFO][4403] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" HandleID="k8s-pod-network.5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" Workload="ci--4580--0--0--p--637e605ed7-k8s-calico--kube--controllers--6bcfd67b5c--6xwgh-eth0" Jan 19 09:26:35.575887 containerd[1676]: 2026-01-19 09:26:35.457 [INFO][4403] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" HandleID="k8s-pod-network.5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" Workload="ci--4580--0--0--p--637e605ed7-k8s-calico--kube--controllers--6bcfd67b5c--6xwgh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000334850), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-637e605ed7", "pod":"calico-kube-controllers-6bcfd67b5c-6xwgh", "timestamp":"2026-01-19 09:26:35.456916407 +0000 UTC"}, Hostname:"ci-4580-0-0-p-637e605ed7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 19 09:26:35.575887 containerd[1676]: 2026-01-19 09:26:35.457 [INFO][4403] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 19 09:26:35.575887 containerd[1676]: 2026-01-19 09:26:35.457 [INFO][4403] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 19 09:26:35.575887 containerd[1676]: 2026-01-19 09:26:35.457 [INFO][4403] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-637e605ed7' Jan 19 09:26:35.575887 containerd[1676]: 2026-01-19 09:26:35.476 [INFO][4403] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.575887 containerd[1676]: 2026-01-19 09:26:35.485 [INFO][4403] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.575887 containerd[1676]: 2026-01-19 09:26:35.491 [INFO][4403] ipam/ipam.go 511: Trying affinity for 192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.575887 containerd[1676]: 2026-01-19 09:26:35.494 [INFO][4403] ipam/ipam.go 158: Attempting to load block cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.575887 containerd[1676]: 2026-01-19 09:26:35.499 [INFO][4403] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.576147 containerd[1676]: 2026-01-19 09:26:35.500 [INFO][4403] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.30.0/26 handle="k8s-pod-network.5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.576147 containerd[1676]: 2026-01-19 09:26:35.524 [INFO][4403] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22 Jan 19 09:26:35.576147 containerd[1676]: 2026-01-19 09:26:35.528 [INFO][4403] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.30.0/26 handle="k8s-pod-network.5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.576147 containerd[1676]: 2026-01-19 09:26:35.533 [INFO][4403] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.30.3/26] block=192.168.30.0/26 handle="k8s-pod-network.5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.576147 containerd[1676]: 2026-01-19 09:26:35.534 [INFO][4403] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.30.3/26] handle="k8s-pod-network.5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.576147 containerd[1676]: 2026-01-19 09:26:35.534 [INFO][4403] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 19 09:26:35.576147 containerd[1676]: 2026-01-19 09:26:35.534 [INFO][4403] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.30.3/26] IPv6=[] ContainerID="5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" HandleID="k8s-pod-network.5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" Workload="ci--4580--0--0--p--637e605ed7-k8s-calico--kube--controllers--6bcfd67b5c--6xwgh-eth0" Jan 19 09:26:35.576315 containerd[1676]: 2026-01-19 09:26:35.538 [INFO][4373] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" Namespace="calico-system" Pod="calico-kube-controllers-6bcfd67b5c-6xwgh" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--kube--controllers--6bcfd67b5c--6xwgh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-calico--kube--controllers--6bcfd67b5c--6xwgh-eth0", GenerateName:"calico-kube-controllers-6bcfd67b5c-", Namespace:"calico-system", SelfLink:"", UID:"c1023597-3f97-435e-a787-6c49b8480f62", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 26, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6bcfd67b5c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"", Pod:"calico-kube-controllers-6bcfd67b5c-6xwgh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.30.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali448c9e7d4cc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:35.576412 containerd[1676]: 2026-01-19 09:26:35.538 [INFO][4373] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.3/32] ContainerID="5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" Namespace="calico-system" Pod="calico-kube-controllers-6bcfd67b5c-6xwgh" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--kube--controllers--6bcfd67b5c--6xwgh-eth0" Jan 19 09:26:35.576412 containerd[1676]: 2026-01-19 09:26:35.538 [INFO][4373] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali448c9e7d4cc ContainerID="5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" Namespace="calico-system" Pod="calico-kube-controllers-6bcfd67b5c-6xwgh" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--kube--controllers--6bcfd67b5c--6xwgh-eth0" Jan 19 09:26:35.576412 containerd[1676]: 2026-01-19 09:26:35.551 [INFO][4373] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" Namespace="calico-system" Pod="calico-kube-controllers-6bcfd67b5c-6xwgh" 
WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--kube--controllers--6bcfd67b5c--6xwgh-eth0" Jan 19 09:26:35.577195 containerd[1676]: 2026-01-19 09:26:35.552 [INFO][4373] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" Namespace="calico-system" Pod="calico-kube-controllers-6bcfd67b5c-6xwgh" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--kube--controllers--6bcfd67b5c--6xwgh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-calico--kube--controllers--6bcfd67b5c--6xwgh-eth0", GenerateName:"calico-kube-controllers-6bcfd67b5c-", Namespace:"calico-system", SelfLink:"", UID:"c1023597-3f97-435e-a787-6c49b8480f62", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 26, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6bcfd67b5c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22", Pod:"calico-kube-controllers-6bcfd67b5c-6xwgh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.30.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali448c9e7d4cc", MAC:"6e:08:43:ea:5c:49", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:35.577263 containerd[1676]: 2026-01-19 09:26:35.570 [INFO][4373] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" Namespace="calico-system" Pod="calico-kube-controllers-6bcfd67b5c-6xwgh" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--kube--controllers--6bcfd67b5c--6xwgh-eth0" Jan 19 09:26:35.578348 systemd-networkd[1554]: vxlan.calico: Gained IPv6LL Jan 19 09:26:35.609000 audit[4455]: NETFILTER_CFG table=filter:123 family=2 entries=50 op=nft_register_chain pid=4455 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 19 09:26:35.609000 audit[4455]: SYSCALL arch=c000003e syscall=46 success=yes exit=28208 a0=3 a1=7ffe49c3d0e0 a2=0 a3=7ffe49c3d0cc items=0 ppid=4115 pid=4455 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.609000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 19 09:26:35.627833 containerd[1676]: time="2026-01-19T09:26:35.627496725Z" level=info msg="connecting to shim 5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22" 
address="unix:///run/containerd/s/98e75e2a575ff7f8520324a6abdfa646e3555395e681e55e8f3f60993eb23d51" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:26:35.662703 systemd-networkd[1554]: calic30fe21e864: Link UP Jan 19 09:26:35.663663 systemd[1]: Started cri-containerd-817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449.scope - libcontainer container 817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449. Jan 19 09:26:35.667693 systemd-networkd[1554]: calic30fe21e864: Gained carrier Jan 19 09:26:35.684000 audit[4490]: NETFILTER_CFG table=filter:124 family=2 entries=40 op=nft_register_chain pid=4490 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 19 09:26:35.684000 audit[4490]: SYSCALL arch=c000003e syscall=46 success=yes exit=20764 a0=3 a1=7fff16ffb8d0 a2=0 a3=7fff16ffb8bc items=0 ppid=4115 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.684000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 19 09:26:35.702000 audit: BPF prog-id=211 op=LOAD Jan 19 09:26:35.706000 audit: BPF prog-id=212 op=LOAD Jan 19 09:26:35.706000 audit[4454]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4436 pid=4454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.706000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831373538366166363239323639316163343361353736356665666464 Jan 19 09:26:35.708000 audit: BPF prog-id=212 op=UNLOAD Jan 19 09:26:35.708000 audit[4454]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4436 pid=4454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.708000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831373538366166363239323639316163343361353736356665666464 Jan 19 09:26:35.710181 containerd[1676]: 2026-01-19 09:26:35.364 [INFO][4382] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--sg9sb-eth0 calico-apiserver-67d5ff6dcb- calico-apiserver 4ce7f23f-45be-4728-a485-c7ae56af8a5c 861 0 2026-01-19 09:26:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:67d5ff6dcb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4580-0-0-p-637e605ed7 calico-apiserver-67d5ff6dcb-sg9sb eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic30fe21e864 [] [] }} ContainerID="24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" 
Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-sg9sb" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--sg9sb-" Jan 19 09:26:35.710181 containerd[1676]: 2026-01-19 09:26:35.365 [INFO][4382] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-sg9sb" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--sg9sb-eth0" Jan 19 09:26:35.710181 containerd[1676]: 2026-01-19 09:26:35.476 [INFO][4408] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" HandleID="k8s-pod-network.24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" Workload="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--sg9sb-eth0" Jan 19 09:26:35.710362 containerd[1676]: 2026-01-19 09:26:35.476 [INFO][4408] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" HandleID="k8s-pod-network.24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" Workload="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--sg9sb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fc60), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4580-0-0-p-637e605ed7", "pod":"calico-apiserver-67d5ff6dcb-sg9sb", "timestamp":"2026-01-19 09:26:35.475768038 +0000 UTC"}, Hostname:"ci-4580-0-0-p-637e605ed7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 19 09:26:35.710362 containerd[1676]: 2026-01-19 09:26:35.476 [INFO][4408] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 19 09:26:35.710362 containerd[1676]: 2026-01-19 09:26:35.534 [INFO][4408] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 19 09:26:35.710362 containerd[1676]: 2026-01-19 09:26:35.534 [INFO][4408] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-637e605ed7' Jan 19 09:26:35.710362 containerd[1676]: 2026-01-19 09:26:35.580 [INFO][4408] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.710362 containerd[1676]: 2026-01-19 09:26:35.588 [INFO][4408] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.710362 containerd[1676]: 2026-01-19 09:26:35.602 [INFO][4408] ipam/ipam.go 511: Trying affinity for 192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.710362 containerd[1676]: 2026-01-19 09:26:35.607 [INFO][4408] ipam/ipam.go 158: Attempting to load block cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.710362 containerd[1676]: 2026-01-19 09:26:35.612 [INFO][4408] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.711688 containerd[1676]: 2026-01-19 09:26:35.613 [INFO][4408] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.30.0/26 handle="k8s-pod-network.24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.711688 containerd[1676]: 2026-01-19 09:26:35.616 [INFO][4408] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102 Jan 19 09:26:35.711688 containerd[1676]: 2026-01-19 09:26:35.627 [INFO][4408] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.30.0/26 handle="k8s-pod-network.24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.711688 containerd[1676]: 2026-01-19 09:26:35.641 [INFO][4408] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.30.4/26] block=192.168.30.0/26 handle="k8s-pod-network.24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.711688 containerd[1676]: 2026-01-19 09:26:35.641 [INFO][4408] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.30.4/26] handle="k8s-pod-network.24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:35.711688 containerd[1676]: 2026-01-19 09:26:35.641 [INFO][4408] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 19 09:26:35.711688 containerd[1676]: 2026-01-19 09:26:35.641 [INFO][4408] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.30.4/26] IPv6=[] ContainerID="24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" HandleID="k8s-pod-network.24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" Workload="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--sg9sb-eth0" Jan 19 09:26:35.711850 containerd[1676]: 2026-01-19 09:26:35.649 [INFO][4382] cni-plugin/k8s.go 418: Populated endpoint ContainerID="24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-sg9sb" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--sg9sb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--sg9sb-eth0", GenerateName:"calico-apiserver-67d5ff6dcb-", Namespace:"calico-apiserver", SelfLink:"", UID:"4ce7f23f-45be-4728-a485-c7ae56af8a5c", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 26, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"67d5ff6dcb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"", Pod:"calico-apiserver-67d5ff6dcb-sg9sb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.30.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic30fe21e864", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:35.711925 containerd[1676]: 2026-01-19 09:26:35.650 [INFO][4382] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.4/32] ContainerID="24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-sg9sb" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--sg9sb-eth0" Jan 19 09:26:35.711925 containerd[1676]: 2026-01-19 09:26:35.650 [INFO][4382] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic30fe21e864 ContainerID="24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-sg9sb" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--sg9sb-eth0" Jan 19 09:26:35.711925 containerd[1676]: 2026-01-19 09:26:35.674 [INFO][4382] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-sg9sb" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--sg9sb-eth0" Jan 19 09:26:35.711999 containerd[1676]: 2026-01-19 09:26:35.677 
[INFO][4382] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-sg9sb" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--sg9sb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--sg9sb-eth0", GenerateName:"calico-apiserver-67d5ff6dcb-", Namespace:"calico-apiserver", SelfLink:"", UID:"4ce7f23f-45be-4728-a485-c7ae56af8a5c", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 26, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"67d5ff6dcb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102", Pod:"calico-apiserver-67d5ff6dcb-sg9sb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.30.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic30fe21e864", MAC:"e2:e5:1d:16:b9:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:35.712061 containerd[1676]: 2026-01-19 09:26:35.692 [INFO][4382] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-sg9sb" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--sg9sb-eth0" Jan 19 09:26:35.710000 audit: BPF prog-id=213 op=LOAD Jan 19 09:26:35.710000 audit[4454]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4436 pid=4454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.710000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831373538366166363239323639316163343361353736356665666464 Jan 19 09:26:35.714000 audit: BPF prog-id=214 op=LOAD Jan 19 09:26:35.714000 audit[4454]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4436 pid=4454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.714000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831373538366166363239323639316163343361353736356665666464 Jan 19 09:26:35.714000 audit: BPF prog-id=214 op=UNLOAD Jan 19 09:26:35.714000 audit[4454]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4436 pid=4454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.714000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831373538366166363239323639316163343361353736356665666464 Jan 19 09:26:35.714000 audit: BPF prog-id=213 op=UNLOAD Jan 19 09:26:35.714000 audit[4454]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4436 pid=4454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.714000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831373538366166363239323639316163343361353736356665666464 Jan 19 09:26:35.715000 audit: BPF prog-id=215 op=LOAD Jan 19 09:26:35.715000 audit[4454]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4436 pid=4454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.715000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831373538366166363239323639316163343361353736356665666464 Jan 19 09:26:35.719700 systemd[1]: Started cri-containerd-5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22.scope - libcontainer container 5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22. 
Jan 19 09:26:35.740000 audit: BPF prog-id=216 op=LOAD Jan 19 09:26:35.741000 audit: BPF prog-id=217 op=LOAD Jan 19 09:26:35.741000 audit[4489]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=4469 pid=4489 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.741000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3531383539323263636334383663613334353732663032343162653636 Jan 19 09:26:35.743000 audit: BPF prog-id=217 op=UNLOAD Jan 19 09:26:35.743000 audit[4489]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4469 pid=4489 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.743000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3531383539323263636334383663613334353732663032343162653636 Jan 19 09:26:35.743000 audit: BPF prog-id=218 op=LOAD Jan 19 09:26:35.743000 audit[4489]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=4469 pid=4489 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.743000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3531383539323263636334383663613334353732663032343162653636 Jan 19 09:26:35.743000 audit: BPF prog-id=219 op=LOAD Jan 19 09:26:35.743000 audit[4489]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=4469 pid=4489 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.743000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3531383539323263636334383663613334353732663032343162653636 Jan 19 09:26:35.743000 audit: BPF prog-id=219 op=UNLOAD Jan 19 09:26:35.743000 audit[4489]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4469 pid=4489 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.743000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3531383539323263636334383663613334353732663032343162653636 Jan 19 09:26:35.743000 audit: BPF prog-id=218 op=UNLOAD Jan 19 09:26:35.743000 audit[4489]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4469 pid=4489 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.743000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3531383539323263636334383663613334353732663032343162653636 Jan 19 09:26:35.743000 audit: BPF prog-id=220 op=LOAD Jan 19 09:26:35.743000 audit[4489]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=4469 pid=4489 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.743000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3531383539323263636334383663613334353732663032343162653636 Jan 19 09:26:35.749717 containerd[1676]: time="2026-01-19T09:26:35.749646928Z" level=info msg="connecting to shim 24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102" address="unix:///run/containerd/s/c6179b1a66d0848728812608d736eccbb8c7906ddb051921027252a7cb7c6033" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:26:35.777000 audit[4551]: NETFILTER_CFG table=filter:125 family=2 entries=51 op=nft_register_chain pid=4551 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 19 09:26:35.777000 audit[4551]: SYSCALL arch=c000003e syscall=46 success=yes exit=27116 a0=3 a1=7ffcb2014440 a2=0 a3=7ffcb201442c items=0 ppid=4115 pid=4551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.777000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 19 09:26:35.792440 containerd[1676]: time="2026-01-19T09:26:35.792340881Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-558fc744bc-cf5rv,Uid:37c84e8a-a62d-48e6-91f5-791aabccbba8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"817586af6292691ac43a5765fefdd57fbf7e5a929938e4d1d246d892e4638449\"" Jan 19 09:26:35.793818 systemd[1]: Started cri-containerd-24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102.scope - libcontainer container 24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102. 
Jan 19 09:26:35.796480 containerd[1676]: time="2026-01-19T09:26:35.795898634Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 19 09:26:35.818000 audit: BPF prog-id=221 op=LOAD Jan 19 09:26:35.818000 audit: BPF prog-id=222 op=LOAD Jan 19 09:26:35.818000 audit[4543]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00018c238 a2=98 a3=0 items=0 ppid=4532 pid=4543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.818000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234616438633066383530666234346633336431346265646262313233 Jan 19 09:26:35.819000 audit: BPF prog-id=222 op=UNLOAD Jan 19 09:26:35.819000 audit[4543]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4532 pid=4543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.819000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234616438633066383530666234346633336431346265646262313233 Jan 19 09:26:35.819000 audit: BPF prog-id=223 op=LOAD Jan 19 09:26:35.819000 audit[4543]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00018c488 a2=98 a3=0 items=0 ppid=4532 pid=4543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.819000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234616438633066383530666234346633336431346265646262313233 Jan 19 09:26:35.819000 audit: BPF prog-id=224 op=LOAD Jan 19 09:26:35.819000 audit[4543]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00018c218 a2=98 a3=0 items=0 ppid=4532 pid=4543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.819000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234616438633066383530666234346633336431346265646262313233 Jan 19 09:26:35.819000 audit: BPF prog-id=224 op=UNLOAD Jan 19 09:26:35.819000 audit[4543]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4532 pid=4543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.819000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234616438633066383530666234346633336431346265646262313233 Jan 19 09:26:35.819000 audit: BPF prog-id=223 op=UNLOAD Jan 19 09:26:35.819000 audit[4543]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4532 pid=4543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.819000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234616438633066383530666234346633336431346265646262313233 Jan 19 09:26:35.819000 audit: BPF prog-id=225 op=LOAD Jan 19 09:26:35.819000 audit[4543]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00018c6e8 a2=98 a3=0 items=0 ppid=4532 pid=4543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:35.819000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234616438633066383530666234346633336431346265646262313233 Jan 19 09:26:35.859591 containerd[1676]: time="2026-01-19T09:26:35.859529006Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6bcfd67b5c-6xwgh,Uid:c1023597-3f97-435e-a787-6c49b8480f62,Namespace:calico-system,Attempt:0,} returns sandbox id \"5185922ccc486ca34572f0241be665d8e1d9e4b8b5b036f0684525915d58ba22\"" Jan 19 09:26:35.883456 containerd[1676]: time="2026-01-19T09:26:35.883380467Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-67d5ff6dcb-sg9sb,Uid:4ce7f23f-45be-4728-a485-c7ae56af8a5c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"24ad8c0f850fb44f33d14bedbb1233e1bddd13ec6b003b8ce2ce495ca23cc102\"" Jan 19 09:26:36.242804 containerd[1676]: time="2026-01-19T09:26:36.242582826Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:36.244467 containerd[1676]: time="2026-01-19T09:26:36.244325063Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 19 09:26:36.244541 containerd[1676]: time="2026-01-19T09:26:36.244450344Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:36.244907 kubelet[2926]: E0119 09:26:36.244844 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:26:36.245546 kubelet[2926]: E0119 09:26:36.245278 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:26:36.245729 kubelet[2926]: E0119 09:26:36.245562 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-558fc744bc-cf5rv_calico-apiserver(37c84e8a-a62d-48e6-91f5-791aabccbba8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:36.245729 kubelet[2926]: E0119 09:26:36.245649 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:26:36.246077 containerd[1676]: time="2026-01-19T09:26:36.246002169Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 19 09:26:36.405696 kubelet[2926]: E0119 09:26:36.405561 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:26:36.498000 audit[4586]: NETFILTER_CFG table=filter:126 family=2 entries=20 op=nft_register_rule pid=4586 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:36.498000 audit[4586]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffc9dc31670 a2=0 a3=7ffc9dc3165c items=0 ppid=3032 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:36.498000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:36.503000 audit[4586]: NETFILTER_CFG table=nat:127 family=2 entries=14 op=nft_register_rule pid=4586 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:36.503000 audit[4586]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffc9dc31670 a2=0 a3=0 items=0 ppid=3032 pid=4586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:36.503000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:36.775131 containerd[1676]: time="2026-01-19T09:26:36.774980711Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:36.777814 containerd[1676]: time="2026-01-19T09:26:36.777716431Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc 
error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 19 09:26:36.777984 containerd[1676]: time="2026-01-19T09:26:36.777719341Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:36.778273 kubelet[2926]: E0119 09:26:36.778061 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 19 09:26:36.778273 kubelet[2926]: E0119 09:26:36.778108 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 19 09:26:36.778332 kubelet[2926]: E0119 09:26:36.778299 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-6bcfd67b5c-6xwgh_calico-system(c1023597-3f97-435e-a787-6c49b8480f62): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:36.778729 kubelet[2926]: E0119 09:26:36.778343 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:26:36.778814 containerd[1676]: time="2026-01-19T09:26:36.778525424Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 19 09:26:37.051125 systemd-networkd[1554]: calic06ddeebd03: Gained IPv6LL Jan 19 09:26:37.198647 containerd[1676]: time="2026-01-19T09:26:37.198570598Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:37.199867 containerd[1676]: time="2026-01-19T09:26:37.199797852Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 19 09:26:37.200048 containerd[1676]: time="2026-01-19T09:26:37.199885342Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:37.200223 kubelet[2926]: E0119 09:26:37.200180 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:26:37.200365 kubelet[2926]: E0119 09:26:37.200326 2926 kuberuntime_image.go:43] 
"Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:26:37.200875 kubelet[2926]: E0119 09:26:37.200833 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-67d5ff6dcb-sg9sb_calico-apiserver(4ce7f23f-45be-4728-a485-c7ae56af8a5c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:37.200923 kubelet[2926]: E0119 09:26:37.200880 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:26:37.227536 containerd[1676]: time="2026-01-19T09:26:37.227411037Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-l9tzs,Uid:d4703ae1-6e1e-4185-acf2-7e68ee8729a3,Namespace:calico-system,Attempt:0,}" Jan 19 09:26:37.228261 containerd[1676]: time="2026-01-19T09:26:37.228226489Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-67d5ff6dcb-bnwh7,Uid:911306cd-bd01-4287-9129-4bbfd31bbd16,Namespace:calico-apiserver,Attempt:0,}" Jan 19 09:26:37.240604 systemd-networkd[1554]: calic30fe21e864: Gained IPv6LL Jan 19 09:26:37.368586 systemd-networkd[1554]: cali448c9e7d4cc: Gained IPv6LL Jan 19 09:26:37.379044 systemd-networkd[1554]: cali82c1e8311a7: Link UP Jan 19 09:26:37.380996 systemd-networkd[1554]: cali82c1e8311a7: Gained carrier Jan 19 09:26:37.397908 containerd[1676]: 2026-01-19 09:26:37.291 [INFO][4587] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--637e605ed7-k8s-goldmane--7c778bb748--l9tzs-eth0 goldmane-7c778bb748- calico-system d4703ae1-6e1e-4185-acf2-7e68ee8729a3 863 0 2026-01-19 09:26:10 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4580-0-0-p-637e605ed7 goldmane-7c778bb748-l9tzs eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali82c1e8311a7 [] [] }} ContainerID="4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" Namespace="calico-system" Pod="goldmane-7c778bb748-l9tzs" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-goldmane--7c778bb748--l9tzs-" Jan 19 09:26:37.397908 containerd[1676]: 2026-01-19 09:26:37.291 [INFO][4587] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" Namespace="calico-system" Pod="goldmane-7c778bb748-l9tzs" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-goldmane--7c778bb748--l9tzs-eth0" Jan 19 09:26:37.397908 containerd[1676]: 2026-01-19 09:26:37.331 [INFO][4610] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" HandleID="k8s-pod-network.4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" Workload="ci--4580--0--0--p--637e605ed7-k8s-goldmane--7c778bb748--l9tzs-eth0" Jan 19 09:26:37.398331 containerd[1676]: 2026-01-19 09:26:37.332 [INFO][4610] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" HandleID="k8s-pod-network.4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" Workload="ci--4580--0--0--p--637e605ed7-k8s-goldmane--7c778bb748--l9tzs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00047e050), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-637e605ed7", "pod":"goldmane-7c778bb748-l9tzs", "timestamp":"2026-01-19 09:26:37.331973606 +0000 UTC"}, Hostname:"ci-4580-0-0-p-637e605ed7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 19 09:26:37.398331 containerd[1676]: 2026-01-19 09:26:37.332 [INFO][4610] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 19 09:26:37.398331 containerd[1676]: 2026-01-19 09:26:37.332 [INFO][4610] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 19 09:26:37.398331 containerd[1676]: 2026-01-19 09:26:37.332 [INFO][4610] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-637e605ed7' Jan 19 09:26:37.398331 containerd[1676]: 2026-01-19 09:26:37.339 [INFO][4610] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.398331 containerd[1676]: 2026-01-19 09:26:37.344 [INFO][4610] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.398331 containerd[1676]: 2026-01-19 09:26:37.349 [INFO][4610] ipam/ipam.go 511: Trying affinity for 192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.398331 containerd[1676]: 2026-01-19 09:26:37.351 [INFO][4610] ipam/ipam.go 158: Attempting to load block cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.398331 containerd[1676]: 2026-01-19 09:26:37.353 [INFO][4610] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.398513 containerd[1676]: 2026-01-19 09:26:37.353 [INFO][4610] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.30.0/26 handle="k8s-pod-network.4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.398513 containerd[1676]: 2026-01-19 09:26:37.355 [INFO][4610] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6 Jan 19 09:26:37.398513 containerd[1676]: 2026-01-19 09:26:37.360 [INFO][4610] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.30.0/26 handle="k8s-pod-network.4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.398513 containerd[1676]: 2026-01-19 09:26:37.367 [INFO][4610] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.30.5/26] block=192.168.30.0/26 handle="k8s-pod-network.4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" 
host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.398513 containerd[1676]: 2026-01-19 09:26:37.368 [INFO][4610] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.30.5/26] handle="k8s-pod-network.4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.398513 containerd[1676]: 2026-01-19 09:26:37.368 [INFO][4610] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 19 09:26:37.398513 containerd[1676]: 2026-01-19 09:26:37.368 [INFO][4610] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.30.5/26] IPv6=[] ContainerID="4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" HandleID="k8s-pod-network.4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" Workload="ci--4580--0--0--p--637e605ed7-k8s-goldmane--7c778bb748--l9tzs-eth0" Jan 19 09:26:37.398627 containerd[1676]: 2026-01-19 09:26:37.371 [INFO][4587] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" Namespace="calico-system" Pod="goldmane-7c778bb748-l9tzs" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-goldmane--7c778bb748--l9tzs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-goldmane--7c778bb748--l9tzs-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"d4703ae1-6e1e-4185-acf2-7e68ee8729a3", ResourceVersion:"863", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 26, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"", Pod:"goldmane-7c778bb748-l9tzs", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.30.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali82c1e8311a7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:37.398666 containerd[1676]: 2026-01-19 09:26:37.371 [INFO][4587] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.5/32] ContainerID="4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" Namespace="calico-system" Pod="goldmane-7c778bb748-l9tzs" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-goldmane--7c778bb748--l9tzs-eth0" Jan 19 09:26:37.398666 containerd[1676]: 2026-01-19 09:26:37.371 [INFO][4587] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali82c1e8311a7 ContainerID="4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" Namespace="calico-system" Pod="goldmane-7c778bb748-l9tzs" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-goldmane--7c778bb748--l9tzs-eth0" Jan 19 09:26:37.398666 containerd[1676]: 2026-01-19 09:26:37.382 [INFO][4587] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" Namespace="calico-system" Pod="goldmane-7c778bb748-l9tzs" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-goldmane--7c778bb748--l9tzs-eth0" Jan 19 09:26:37.398735 containerd[1676]: 2026-01-19 09:26:37.383 [INFO][4587] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" Namespace="calico-system" Pod="goldmane-7c778bb748-l9tzs" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-goldmane--7c778bb748--l9tzs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-goldmane--7c778bb748--l9tzs-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"d4703ae1-6e1e-4185-acf2-7e68ee8729a3", ResourceVersion:"863", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 26, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6", Pod:"goldmane-7c778bb748-l9tzs", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.30.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali82c1e8311a7", MAC:"76:99:80:1f:d4:f5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:37.398772 containerd[1676]: 2026-01-19 09:26:37.395 [INFO][4587] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" Namespace="calico-system" Pod="goldmane-7c778bb748-l9tzs" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-goldmane--7c778bb748--l9tzs-eth0" Jan 19 09:26:37.413659 kubelet[2926]: E0119 09:26:37.413486 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:26:37.414870 kubelet[2926]: E0119 09:26:37.414022 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:26:37.414870 kubelet[2926]: E0119 09:26:37.413581 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:26:37.443732 containerd[1676]: time="2026-01-19T09:26:37.443641220Z" level=info msg="connecting to shim 4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6" address="unix:///run/containerd/s/74f0dbaf864c359772bf2807ba1e2da765e7d3019333c0c0e5b0e3f716365709" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:26:37.491762 systemd[1]: Started cri-containerd-4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6.scope - libcontainer container 4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6. Jan 19 09:26:37.505000 audit[4663]: NETFILTER_CFG table=filter:128 family=2 entries=52 op=nft_register_chain pid=4663 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 19 09:26:37.509034 kernel: kauditd_printk_skb: 312 callbacks suppressed Jan 19 09:26:37.509142 kernel: audit: type=1325 audit(1768814797.505:691): table=filter:128 family=2 entries=52 op=nft_register_chain pid=4663 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 19 09:26:37.505000 audit[4663]: SYSCALL arch=c000003e syscall=46 success=yes exit=27540 a0=3 a1=7ffd860ec810 a2=0 a3=7ffd860ec7fc items=0 ppid=4115 pid=4663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.522466 kernel: audit: type=1300 audit(1768814797.505:691): arch=c000003e syscall=46 success=yes exit=27540 a0=3 a1=7ffd860ec810 a2=0 a3=7ffd860ec7fc items=0 ppid=4115 pid=4663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.505000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 19 09:26:37.529416 kernel: audit: type=1327 audit(1768814797.505:691): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 19 09:26:37.530032 systemd-networkd[1554]: cali6b8a1004ca9: Link UP Jan 19 09:26:37.531079 systemd-networkd[1554]: cali6b8a1004ca9: Gained carrier Jan 19 09:26:37.544475 kernel: audit: type=1334 audit(1768814797.539:692): prog-id=226 op=LOAD Jan 19 09:26:37.539000 audit: BPF prog-id=226 op=LOAD Jan 19 09:26:37.539000 audit: BPF prog-id=227 op=LOAD Jan 19 09:26:37.559826 kernel: audit: type=1334 audit(1768814797.539:693): prog-id=227 op=LOAD Jan 19 09:26:37.559936 kernel: audit: type=1300 audit(1768814797.539:693): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4641 pid=4652 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.539000 audit[4652]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4641 pid=4652 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.539000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435353964393233393562343966373161383439383564633533633733 Jan 19 09:26:37.560733 containerd[1676]: 2026-01-19 09:26:37.294 [INFO][4591] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--bnwh7-eth0 calico-apiserver-67d5ff6dcb- calico-apiserver 911306cd-bd01-4287-9129-4bbfd31bbd16 862 0 2026-01-19 09:26:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:67d5ff6dcb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4580-0-0-p-637e605ed7 calico-apiserver-67d5ff6dcb-bnwh7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali6b8a1004ca9 [] [] }} ContainerID="00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-bnwh7" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--bnwh7-" Jan 19 09:26:37.560733 containerd[1676]: 2026-01-19 09:26:37.295 [INFO][4591] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-bnwh7" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--bnwh7-eth0" Jan 19 09:26:37.560733 containerd[1676]: 2026-01-19 09:26:37.334 [INFO][4616] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" HandleID="k8s-pod-network.00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" Workload="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--bnwh7-eth0" Jan 19 09:26:37.560869 containerd[1676]: 2026-01-19 09:26:37.335 [INFO][4616] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" HandleID="k8s-pod-network.00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" Workload="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--bnwh7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000323390), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4580-0-0-p-637e605ed7", "pod":"calico-apiserver-67d5ff6dcb-bnwh7", "timestamp":"2026-01-19 09:26:37.334818356 +0000 UTC"}, Hostname:"ci-4580-0-0-p-637e605ed7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 19 09:26:37.560869 containerd[1676]: 
2026-01-19 09:26:37.335 [INFO][4616] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 19 09:26:37.560869 containerd[1676]: 2026-01-19 09:26:37.370 [INFO][4616] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 19 09:26:37.560869 containerd[1676]: 2026-01-19 09:26:37.371 [INFO][4616] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-637e605ed7' Jan 19 09:26:37.560869 containerd[1676]: 2026-01-19 09:26:37.442 [INFO][4616] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.560869 containerd[1676]: 2026-01-19 09:26:37.466 [INFO][4616] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.560869 containerd[1676]: 2026-01-19 09:26:37.483 [INFO][4616] ipam/ipam.go 511: Trying affinity for 192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.560869 containerd[1676]: 2026-01-19 09:26:37.487 [INFO][4616] ipam/ipam.go 158: Attempting to load block cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.560869 containerd[1676]: 2026-01-19 09:26:37.494 [INFO][4616] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.561012 containerd[1676]: 2026-01-19 09:26:37.494 [INFO][4616] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.30.0/26 handle="k8s-pod-network.00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.561012 containerd[1676]: 2026-01-19 09:26:37.497 [INFO][4616] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91 Jan 19 09:26:37.561012 containerd[1676]: 2026-01-19 09:26:37.501 [INFO][4616] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.30.0/26 handle="k8s-pod-network.00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.561012 containerd[1676]: 2026-01-19 09:26:37.513 [INFO][4616] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.30.6/26] block=192.168.30.0/26 handle="k8s-pod-network.00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.561012 containerd[1676]: 2026-01-19 09:26:37.513 [INFO][4616] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.30.6/26] handle="k8s-pod-network.00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:37.561012 containerd[1676]: 2026-01-19 09:26:37.513 [INFO][4616] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 19 09:26:37.561012 containerd[1676]: 2026-01-19 09:26:37.513 [INFO][4616] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.30.6/26] IPv6=[] ContainerID="00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" HandleID="k8s-pod-network.00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" Workload="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--bnwh7-eth0" Jan 19 09:26:37.561116 containerd[1676]: 2026-01-19 09:26:37.524 [INFO][4591] cni-plugin/k8s.go 418: Populated endpoint ContainerID="00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-bnwh7" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--bnwh7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--bnwh7-eth0", GenerateName:"calico-apiserver-67d5ff6dcb-", Namespace:"calico-apiserver", SelfLink:"", UID:"911306cd-bd01-4287-9129-4bbfd31bbd16", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 26, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"67d5ff6dcb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"", Pod:"calico-apiserver-67d5ff6dcb-bnwh7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.30.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6b8a1004ca9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:37.561158 containerd[1676]: 2026-01-19 09:26:37.524 [INFO][4591] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.6/32] ContainerID="00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-bnwh7" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--bnwh7-eth0" Jan 19 09:26:37.561158 containerd[1676]: 2026-01-19 09:26:37.524 [INFO][4591] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6b8a1004ca9 ContainerID="00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-bnwh7" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--bnwh7-eth0" Jan 19 09:26:37.561158 containerd[1676]: 2026-01-19 09:26:37.532 [INFO][4591] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-bnwh7" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--bnwh7-eth0" Jan 19 09:26:37.561205 containerd[1676]: 2026-01-19 09:26:37.533 
[INFO][4591] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-bnwh7" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--bnwh7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--bnwh7-eth0", GenerateName:"calico-apiserver-67d5ff6dcb-", Namespace:"calico-apiserver", SelfLink:"", UID:"911306cd-bd01-4287-9129-4bbfd31bbd16", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 26, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"67d5ff6dcb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91", Pod:"calico-apiserver-67d5ff6dcb-bnwh7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.30.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6b8a1004ca9", MAC:"92:04:69:22:b4:95", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:37.561246 containerd[1676]: 2026-01-19 09:26:37.549 [INFO][4591] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" Namespace="calico-apiserver" Pod="calico-apiserver-67d5ff6dcb-bnwh7" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-calico--apiserver--67d5ff6dcb--bnwh7-eth0" Jan 19 09:26:37.568432 kernel: audit: type=1327 audit(1768814797.539:693): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435353964393233393562343966373161383439383564633533633733 Jan 19 09:26:37.540000 audit: BPF prog-id=227 op=UNLOAD Jan 19 09:26:37.589474 kernel: audit: type=1334 audit(1768814797.540:694): prog-id=227 op=UNLOAD Jan 19 09:26:37.589559 kernel: audit: type=1300 audit(1768814797.540:694): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4641 pid=4652 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.589599 kernel: audit: type=1327 audit(1768814797.540:694): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435353964393233393562343966373161383439383564633533633733 Jan 19 09:26:37.540000 audit[4652]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4641 pid=4652 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435353964393233393562343966373161383439383564633533633733 Jan 19 09:26:37.540000 audit: BPF prog-id=228 op=LOAD Jan 19 09:26:37.540000 audit[4652]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4641 pid=4652 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435353964393233393562343966373161383439383564633533633733 Jan 19 09:26:37.540000 audit: BPF prog-id=229 op=LOAD Jan 19 09:26:37.540000 audit[4652]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4641 pid=4652 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435353964393233393562343966373161383439383564633533633733 Jan 19 09:26:37.540000 audit: BPF prog-id=229 op=UNLOAD Jan 19 09:26:37.540000 audit[4652]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4641 pid=4652 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435353964393233393562343966373161383439383564633533633733 Jan 19 09:26:37.540000 audit: BPF prog-id=228 op=UNLOAD Jan 19 09:26:37.540000 audit[4652]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4641 pid=4652 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435353964393233393562343966373161383439383564633533633733 Jan 19 09:26:37.540000 audit: BPF prog-id=230 op=LOAD Jan 19 09:26:37.540000 audit[4652]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4641 pid=4652 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3435353964393233393562343966373161383439383564633533633733 Jan 19 09:26:37.551000 audit[4675]: NETFILTER_CFG table=filter:129 family=2 entries=20 op=nft_register_rule pid=4675 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:37.551000 audit[4675]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffc5f2d1a40 a2=0 a3=7ffc5f2d1a2c items=0 ppid=3032 pid=4675 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.551000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:37.564000 audit[4675]: NETFILTER_CFG table=nat:130 family=2 entries=14 op=nft_register_rule pid=4675 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:37.564000 audit[4675]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffc5f2d1a40 a2=0 a3=0 items=0 ppid=3032 pid=4675 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.564000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:37.619000 audit[4687]: NETFILTER_CFG table=filter:131 family=2 entries=49 op=nft_register_chain pid=4687 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 19 09:26:37.619000 audit[4687]: SYSCALL arch=c000003e syscall=46 success=yes exit=25436 a0=3 a1=7ffcfb652d50 a2=0 a3=7ffcfb652d3c items=0 ppid=4115 pid=4687 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.619000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 19 09:26:37.626007 containerd[1676]: time="2026-01-19T09:26:37.625954228Z" level=info msg="connecting to shim 00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91" address="unix:///run/containerd/s/ea3654882ae259d4827cd57903460a53937671500d73e9fb1ed1b02c740da3ab" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:26:37.641715 containerd[1676]: time="2026-01-19T09:26:37.641660422Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-l9tzs,Uid:d4703ae1-6e1e-4185-acf2-7e68ee8729a3,Namespace:calico-system,Attempt:0,} returns sandbox id \"4559d92395b49f71a84985dc53c7372120afceaa5480d5cab00113638810d1f6\"" Jan 19 09:26:37.644431 containerd[1676]: time="2026-01-19T09:26:37.643292797Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 19 09:26:37.655605 systemd[1]: Started cri-containerd-00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91.scope - libcontainer container 00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91. 
Jan 19 09:26:37.667000 audit: BPF prog-id=231 op=LOAD Jan 19 09:26:37.667000 audit: BPF prog-id=232 op=LOAD Jan 19 09:26:37.667000 audit[4713]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4697 pid=4713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.667000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030363231323135616438356662363666373431663731656537633264 Jan 19 09:26:37.667000 audit: BPF prog-id=232 op=UNLOAD Jan 19 09:26:37.667000 audit[4713]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4697 pid=4713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.667000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030363231323135616438356662363666373431663731656537633264 Jan 19 09:26:37.667000 audit: BPF prog-id=233 op=LOAD Jan 19 09:26:37.667000 audit[4713]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4697 pid=4713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.667000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030363231323135616438356662363666373431663731656537633264 Jan 19 09:26:37.668000 audit: BPF prog-id=234 op=LOAD Jan 19 09:26:37.668000 audit[4713]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4697 pid=4713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.668000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030363231323135616438356662363666373431663731656537633264 Jan 19 09:26:37.668000 audit: BPF prog-id=234 op=UNLOAD Jan 19 09:26:37.668000 audit[4713]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4697 pid=4713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.668000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030363231323135616438356662363666373431663731656537633264 Jan 19 09:26:37.668000 audit: BPF prog-id=233 op=UNLOAD Jan 19 09:26:37.668000 audit[4713]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4697 pid=4713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.668000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030363231323135616438356662363666373431663731656537633264 Jan 19 09:26:37.668000 audit: BPF prog-id=235 op=LOAD Jan 19 09:26:37.668000 audit[4713]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4697 pid=4713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:37.668000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030363231323135616438356662363666373431663731656537633264 Jan 19 09:26:37.705828 containerd[1676]: time="2026-01-19T09:26:37.705777952Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-67d5ff6dcb-bnwh7,Uid:911306cd-bd01-4287-9129-4bbfd31bbd16,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"00621215ad85fb66f741f71ee7c2d0432e5ffcc197cadc1e3a3ff3195d52ab91\"" Jan 19 09:26:38.071800 containerd[1676]: time="2026-01-19T09:26:38.071731950Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:38.072862 containerd[1676]: time="2026-01-19T09:26:38.072803693Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 19 09:26:38.073037 containerd[1676]: time="2026-01-19T09:26:38.072912843Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:38.073446 kubelet[2926]: E0119 09:26:38.073133 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 19 09:26:38.073446 kubelet[2926]: E0119 09:26:38.073196 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 19 09:26:38.073522 kubelet[2926]: E0119 09:26:38.073428 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-l9tzs_calico-system(d4703ae1-6e1e-4185-acf2-7e68ee8729a3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:38.073522 kubelet[2926]: E0119 09:26:38.073503 2926 pod_workers.go:1324] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:26:38.073864 containerd[1676]: time="2026-01-19T09:26:38.073800647Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 19 09:26:38.227032 containerd[1676]: time="2026-01-19T09:26:38.226719667Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-bvxq9,Uid:a95d2224-19eb-41be-ac7f-5930c584a95b,Namespace:kube-system,Attempt:0,}" Jan 19 09:26:38.363928 systemd-networkd[1554]: calie182e811c04: Link UP Jan 19 09:26:38.365957 systemd-networkd[1554]: calie182e811c04: Gained carrier Jan 19 09:26:38.383303 containerd[1676]: 2026-01-19 09:26:38.299 [INFO][4740] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--bvxq9-eth0 coredns-66bc5c9577- kube-system a95d2224-19eb-41be-ac7f-5930c584a95b 851 0 2026-01-19 09:25:57 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4580-0-0-p-637e605ed7 coredns-66bc5c9577-bvxq9 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie182e811c04 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" Namespace="kube-system" Pod="coredns-66bc5c9577-bvxq9" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--bvxq9-" Jan 19 09:26:38.383303 containerd[1676]: 2026-01-19 09:26:38.299 [INFO][4740] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" Namespace="kube-system" Pod="coredns-66bc5c9577-bvxq9" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--bvxq9-eth0" Jan 19 09:26:38.383303 containerd[1676]: 2026-01-19 09:26:38.331 [INFO][4751] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" HandleID="k8s-pod-network.f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" Workload="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--bvxq9-eth0" Jan 19 09:26:38.383888 containerd[1676]: 2026-01-19 09:26:38.331 [INFO][4751] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" HandleID="k8s-pod-network.f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" Workload="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--bvxq9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4fe0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4580-0-0-p-637e605ed7", "pod":"coredns-66bc5c9577-bvxq9", "timestamp":"2026-01-19 09:26:38.33144065 +0000 UTC"}, Hostname:"ci-4580-0-0-p-637e605ed7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 19 
09:26:38.383888 containerd[1676]: 2026-01-19 09:26:38.331 [INFO][4751] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 19 09:26:38.383888 containerd[1676]: 2026-01-19 09:26:38.331 [INFO][4751] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 19 09:26:38.383888 containerd[1676]: 2026-01-19 09:26:38.331 [INFO][4751] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-637e605ed7' Jan 19 09:26:38.383888 containerd[1676]: 2026-01-19 09:26:38.336 [INFO][4751] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:38.383888 containerd[1676]: 2026-01-19 09:26:38.340 [INFO][4751] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:38.383888 containerd[1676]: 2026-01-19 09:26:38.343 [INFO][4751] ipam/ipam.go 511: Trying affinity for 192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:38.383888 containerd[1676]: 2026-01-19 09:26:38.344 [INFO][4751] ipam/ipam.go 158: Attempting to load block cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:38.383888 containerd[1676]: 2026-01-19 09:26:38.346 [INFO][4751] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:38.384123 containerd[1676]: 2026-01-19 09:26:38.346 [INFO][4751] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.30.0/26 handle="k8s-pod-network.f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:38.384123 containerd[1676]: 2026-01-19 09:26:38.348 [INFO][4751] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836 Jan 19 09:26:38.384123 containerd[1676]: 2026-01-19 09:26:38.351 [INFO][4751] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.30.0/26 handle="k8s-pod-network.f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:38.384123 containerd[1676]: 2026-01-19 09:26:38.356 [INFO][4751] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.30.7/26] block=192.168.30.0/26 handle="k8s-pod-network.f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:38.384123 containerd[1676]: 2026-01-19 09:26:38.356 [INFO][4751] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.30.7/26] handle="k8s-pod-network.f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:38.384123 containerd[1676]: 2026-01-19 09:26:38.356 [INFO][4751] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 19 09:26:38.384123 containerd[1676]: 2026-01-19 09:26:38.356 [INFO][4751] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.30.7/26] IPv6=[] ContainerID="f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" HandleID="k8s-pod-network.f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" Workload="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--bvxq9-eth0" Jan 19 09:26:38.384706 containerd[1676]: 2026-01-19 09:26:38.359 [INFO][4740] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" Namespace="kube-system" Pod="coredns-66bc5c9577-bvxq9" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--bvxq9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--bvxq9-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"a95d2224-19eb-41be-ac7f-5930c584a95b", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 25, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"", Pod:"coredns-66bc5c9577-bvxq9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.30.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie182e811c04", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:38.384706 containerd[1676]: 2026-01-19 09:26:38.360 [INFO][4740] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.7/32] ContainerID="f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" Namespace="kube-system" Pod="coredns-66bc5c9577-bvxq9" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--bvxq9-eth0" Jan 19 09:26:38.384706 containerd[1676]: 2026-01-19 09:26:38.360 [INFO][4740] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie182e811c04 ContainerID="f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" Namespace="kube-system" Pod="coredns-66bc5c9577-bvxq9" 
WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--bvxq9-eth0" Jan 19 09:26:38.384706 containerd[1676]: 2026-01-19 09:26:38.366 [INFO][4740] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" Namespace="kube-system" Pod="coredns-66bc5c9577-bvxq9" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--bvxq9-eth0" Jan 19 09:26:38.384706 containerd[1676]: 2026-01-19 09:26:38.367 [INFO][4740] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" Namespace="kube-system" Pod="coredns-66bc5c9577-bvxq9" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--bvxq9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--bvxq9-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"a95d2224-19eb-41be-ac7f-5930c584a95b", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 25, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836", Pod:"coredns-66bc5c9577-bvxq9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.30.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie182e811c04", MAC:"72:aa:a0:84:fd:3d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:38.385183 containerd[1676]: 2026-01-19 09:26:38.378 [INFO][4740] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" Namespace="kube-system" Pod="coredns-66bc5c9577-bvxq9" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--bvxq9-eth0" Jan 19 09:26:38.413682 kubelet[2926]: E0119 09:26:38.412556 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:26:38.408000 audit[4767]: NETFILTER_CFG table=filter:132 family=2 entries=58 op=nft_register_chain pid=4767 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 19 09:26:38.408000 audit[4767]: SYSCALL arch=c000003e syscall=46 success=yes exit=27288 a0=3 a1=7ffd8c51bdb0 a2=0 a3=7ffd8c51bd9c items=0 ppid=4115 pid=4767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.408000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 19 09:26:38.415046 containerd[1676]: time="2026-01-19T09:26:38.414885284Z" level=info msg="connecting to shim f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836" address="unix:///run/containerd/s/fbaf52c7ab9a681f6646cc274cae49795153a395b43d36b174f4a1c375aec8eb" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:26:38.458684 systemd[1]: Started cri-containerd-f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836.scope - libcontainer container f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836. Jan 19 09:26:38.472000 audit: BPF prog-id=236 op=LOAD Jan 19 09:26:38.473000 audit: BPF prog-id=237 op=LOAD Jan 19 09:26:38.473000 audit[4787]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4776 pid=4787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.473000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632653561316533326566383339343831313461333964613333633532 Jan 19 09:26:38.473000 audit: BPF prog-id=237 op=UNLOAD Jan 19 09:26:38.473000 audit[4787]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4776 pid=4787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.473000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632653561316533326566383339343831313461333964613333633532 Jan 19 09:26:38.473000 audit: BPF prog-id=238 op=LOAD Jan 19 09:26:38.473000 audit[4787]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4776 pid=4787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.473000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632653561316533326566383339343831313461333964613333633532 Jan 19 09:26:38.473000 audit: BPF prog-id=239 op=LOAD Jan 19 09:26:38.473000 audit[4787]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4776 pid=4787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.473000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632653561316533326566383339343831313461333964613333633532 Jan 19 09:26:38.473000 audit: BPF prog-id=239 op=UNLOAD Jan 19 09:26:38.473000 audit[4787]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4776 pid=4787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.473000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632653561316533326566383339343831313461333964613333633532 Jan 19 09:26:38.474000 audit: BPF prog-id=238 op=UNLOAD Jan 19 09:26:38.474000 audit[4787]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4776 pid=4787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.474000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632653561316533326566383339343831313461333964613333633532 Jan 19 09:26:38.474000 audit: BPF prog-id=240 op=LOAD Jan 19 09:26:38.474000 audit[4787]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4776 pid=4787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.474000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632653561316533326566383339343831313461333964613333633532 Jan 19 09:26:38.514426 containerd[1676]: time="2026-01-19T09:26:38.514323150Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:38.515530 containerd[1676]: time="2026-01-19T09:26:38.515504504Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-bvxq9,Uid:a95d2224-19eb-41be-ac7f-5930c584a95b,Namespace:kube-system,Attempt:0,} returns sandbox id \"f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836\"" Jan 19 09:26:38.517489 containerd[1676]: time="2026-01-19T09:26:38.517458810Z" level=error 
msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 19 09:26:38.517943 containerd[1676]: time="2026-01-19T09:26:38.517603990Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:38.518112 kubelet[2926]: E0119 09:26:38.518085 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:26:38.518316 kubelet[2926]: E0119 09:26:38.518177 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:26:38.518316 kubelet[2926]: E0119 09:26:38.518244 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-67d5ff6dcb-bnwh7_calico-apiserver(911306cd-bd01-4287-9129-4bbfd31bbd16): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:38.518316 kubelet[2926]: E0119 09:26:38.518275 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" Jan 19 09:26:38.520119 containerd[1676]: time="2026-01-19T09:26:38.519940088Z" level=info msg="CreateContainer within sandbox \"f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 19 09:26:38.521719 systemd-networkd[1554]: cali82c1e8311a7: Gained IPv6LL Jan 19 09:26:38.542503 containerd[1676]: time="2026-01-19T09:26:38.541609209Z" level=info msg="Container 7b5d28ba5707d9ae9e6449279756d8e169dfa5ca72780f9c073f4a59f96eb59a: CDI devices from CRI Config.CDIDevices: []" Jan 19 09:26:38.548792 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2744396737.mount: Deactivated successfully. 
Jan 19 09:26:38.552797 containerd[1676]: time="2026-01-19T09:26:38.552723936Z" level=info msg="CreateContainer within sandbox \"f2e5a1e32ef83948114a39da33c52fb880512db8e209829b766f84ccd54e9836\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7b5d28ba5707d9ae9e6449279756d8e169dfa5ca72780f9c073f4a59f96eb59a\"" Jan 19 09:26:38.554978 containerd[1676]: time="2026-01-19T09:26:38.554938892Z" level=info msg="StartContainer for \"7b5d28ba5707d9ae9e6449279756d8e169dfa5ca72780f9c073f4a59f96eb59a\"" Jan 19 09:26:38.556281 containerd[1676]: time="2026-01-19T09:26:38.556232188Z" level=info msg="connecting to shim 7b5d28ba5707d9ae9e6449279756d8e169dfa5ca72780f9c073f4a59f96eb59a" address="unix:///run/containerd/s/fbaf52c7ab9a681f6646cc274cae49795153a395b43d36b174f4a1c375aec8eb" protocol=ttrpc version=3 Jan 19 09:26:38.590652 systemd[1]: Started cri-containerd-7b5d28ba5707d9ae9e6449279756d8e169dfa5ca72780f9c073f4a59f96eb59a.scope - libcontainer container 7b5d28ba5707d9ae9e6449279756d8e169dfa5ca72780f9c073f4a59f96eb59a. Jan 19 09:26:38.603000 audit[4831]: NETFILTER_CFG table=filter:133 family=2 entries=20 op=nft_register_rule pid=4831 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:38.603000 audit[4831]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffcc772a570 a2=0 a3=7ffcc772a55c items=0 ppid=3032 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.603000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:38.606000 audit[4831]: NETFILTER_CFG table=nat:134 family=2 entries=14 op=nft_register_rule pid=4831 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:38.606000 audit[4831]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffcc772a570 a2=0 a3=0 items=0 ppid=3032 pid=4831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.606000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:38.609000 audit: BPF prog-id=241 op=LOAD Jan 19 09:26:38.609000 audit: BPF prog-id=242 op=LOAD Jan 19 09:26:38.609000 audit[4812]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=4776 pid=4812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.609000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762356432386261353730376439616539653634343932373937353664 Jan 19 09:26:38.609000 audit: BPF prog-id=242 op=UNLOAD Jan 19 09:26:38.609000 audit[4812]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4776 pid=4812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.609000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762356432386261353730376439616539653634343932373937353664 Jan 19 09:26:38.609000 audit: BPF prog-id=243 op=LOAD Jan 19 09:26:38.609000 audit[4812]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=4776 pid=4812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.609000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762356432386261353730376439616539653634343932373937353664 Jan 19 09:26:38.609000 audit: BPF prog-id=244 op=LOAD Jan 19 09:26:38.609000 audit[4812]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=4776 pid=4812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.609000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762356432386261353730376439616539653634343932373937353664 Jan 19 09:26:38.610000 audit: BPF prog-id=244 op=UNLOAD Jan 19 09:26:38.610000 audit[4812]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4776 pid=4812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.610000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762356432386261353730376439616539653634343932373937353664 Jan 19 09:26:38.610000 audit: BPF prog-id=243 op=UNLOAD Jan 19 09:26:38.610000 audit[4812]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4776 pid=4812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.610000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762356432386261353730376439616539653634343932373937353664 Jan 19 09:26:38.610000 audit: BPF prog-id=245 op=LOAD Jan 19 09:26:38.610000 audit[4812]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=4776 pid=4812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:38.610000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762356432386261353730376439616539653634343932373937353664 Jan 19 09:26:38.632192 containerd[1676]: time="2026-01-19T09:26:38.632079436Z" level=info msg="StartContainer for \"7b5d28ba5707d9ae9e6449279756d8e169dfa5ca72780f9c073f4a59f96eb59a\" returns successfully" Jan 19 09:26:39.227372 containerd[1676]: time="2026-01-19T09:26:39.227267352Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-km4zx,Uid:cf97e1c1-8e02-43e7-988e-adabcb45ce04,Namespace:calico-system,Attempt:0,}" Jan 19 09:26:39.229776 containerd[1676]: time="2026-01-19T09:26:39.229637460Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-fr6vw,Uid:c3af37da-419f-42f6-9360-491eb370d45d,Namespace:kube-system,Attempt:0,}" Jan 19 09:26:39.389262 systemd-networkd[1554]: cali28223f35f05: Link UP Jan 19 09:26:39.390186 systemd-networkd[1554]: cali28223f35f05: Gained carrier Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.333 [INFO][4844] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--637e605ed7-k8s-csi--node--driver--km4zx-eth0 csi-node-driver- calico-system cf97e1c1-8e02-43e7-988e-adabcb45ce04 750 0 2026-01-19 09:26:12 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4580-0-0-p-637e605ed7 csi-node-driver-km4zx eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali28223f35f05 [] [] }} ContainerID="25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" Namespace="calico-system" Pod="csi-node-driver-km4zx" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-csi--node--driver--km4zx-" Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.333 [INFO][4844] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" Namespace="calico-system" Pod="csi-node-driver-km4zx" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-csi--node--driver--km4zx-eth0" Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.354 [INFO][4870] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" HandleID="k8s-pod-network.25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" Workload="ci--4580--0--0--p--637e605ed7-k8s-csi--node--driver--km4zx-eth0" Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.355 [INFO][4870] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" HandleID="k8s-pod-network.25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" Workload="ci--4580--0--0--p--637e605ed7-k8s-csi--node--driver--km4zx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002b73a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-637e605ed7", "pod":"csi-node-driver-km4zx", "timestamp":"2026-01-19 09:26:39.35474735 +0000 UTC"}, Hostname:"ci-4580-0-0-p-637e605ed7", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.355 [INFO][4870] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.355 [INFO][4870] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.355 [INFO][4870] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-637e605ed7' Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.362 [INFO][4870] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.367 [INFO][4870] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.370 [INFO][4870] ipam/ipam.go 511: Trying affinity for 192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.372 [INFO][4870] ipam/ipam.go 158: Attempting to load block cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.373 [INFO][4870] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.373 [INFO][4870] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.30.0/26 handle="k8s-pod-network.25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.374 [INFO][4870] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125 Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.377 [INFO][4870] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.30.0/26 handle="k8s-pod-network.25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.383 [INFO][4870] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.30.8/26] block=192.168.30.0/26 handle="k8s-pod-network.25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.383 [INFO][4870] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.30.8/26] handle="k8s-pod-network.25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.383 [INFO][4870] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
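The audit PROCTITLE fields in the records above are simply the audited process's argv, hex-encoded with NUL separators, so they can be decoded offline. A minimal Python sketch (the helper name decode_proctitle is only for illustration), applied to the first iptables-restore record in this section:

    def decode_proctitle(hex_str: str) -> str:
        # audit PROCTITLE records hex-encode argv, with a NUL byte between arguments
        return bytes.fromhex(hex_str).replace(b"\x00", b" ").decode()

    print(decode_proctitle(
        "69707461626C65732D726573746F7265002D770035"
        "002D2D6E6F666C757368002D2D636F756E74657273"
    ))
    # -> iptables-restore -w 5 --noflush --counters

The IPAM transaction that finishes here is plain block arithmetic: the plugin takes the host-wide IPAM lock, confirms this node's affinity for the 192.168.30.0/26 block, claims the next free address (192.168.30.8 for csi-node-driver-km4zx; 192.168.30.9 goes to coredns-66bc5c9577-fr6vw further down), releases the lock, and writes the endpoint with a /32. An illustrative check with Python's standard ipaddress module, using only the numbers from the log (this is not Calico's allocator):

    import ipaddress

    block = ipaddress.ip_network("192.168.30.0/26")    # node's affinity block, per ipam/ipam.go above
    claimed = [ipaddress.ip_address("192.168.30.8"),   # csi-node-driver-km4zx
               ipaddress.ip_address("192.168.30.9")]   # coredns-66bc5c9577-fr6vw

    print(block.num_addresses)    # 64 addresses in a /26 block (Calico's default block size)
    for ip in claimed:
        assert ip in block        # every claimed address sits inside the affinity block
        print(f"{ip}/32")         # the WorkloadEndpoint entries below carry the address as a /32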
Jan 19 09:26:39.406113 containerd[1676]: 2026-01-19 09:26:39.383 [INFO][4870] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.30.8/26] IPv6=[] ContainerID="25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" HandleID="k8s-pod-network.25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" Workload="ci--4580--0--0--p--637e605ed7-k8s-csi--node--driver--km4zx-eth0" Jan 19 09:26:39.407041 containerd[1676]: 2026-01-19 09:26:39.386 [INFO][4844] cni-plugin/k8s.go 418: Populated endpoint ContainerID="25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" Namespace="calico-system" Pod="csi-node-driver-km4zx" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-csi--node--driver--km4zx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-csi--node--driver--km4zx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"cf97e1c1-8e02-43e7-988e-adabcb45ce04", ResourceVersion:"750", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 26, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"", Pod:"csi-node-driver-km4zx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.30.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali28223f35f05", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:39.407041 containerd[1676]: 2026-01-19 09:26:39.386 [INFO][4844] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.8/32] ContainerID="25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" Namespace="calico-system" Pod="csi-node-driver-km4zx" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-csi--node--driver--km4zx-eth0" Jan 19 09:26:39.407041 containerd[1676]: 2026-01-19 09:26:39.386 [INFO][4844] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali28223f35f05 ContainerID="25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" Namespace="calico-system" Pod="csi-node-driver-km4zx" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-csi--node--driver--km4zx-eth0" Jan 19 09:26:39.407041 containerd[1676]: 2026-01-19 09:26:39.391 [INFO][4844] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" Namespace="calico-system" Pod="csi-node-driver-km4zx" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-csi--node--driver--km4zx-eth0" Jan 19 09:26:39.407041 containerd[1676]: 2026-01-19 09:26:39.392 [INFO][4844] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" Namespace="calico-system" Pod="csi-node-driver-km4zx" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-csi--node--driver--km4zx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-csi--node--driver--km4zx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"cf97e1c1-8e02-43e7-988e-adabcb45ce04", ResourceVersion:"750", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 26, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125", Pod:"csi-node-driver-km4zx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.30.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali28223f35f05", MAC:"3a:f8:0c:90:38:2e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:39.407041 containerd[1676]: 2026-01-19 09:26:39.399 [INFO][4844] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" Namespace="calico-system" Pod="csi-node-driver-km4zx" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-csi--node--driver--km4zx-eth0" Jan 19 09:26:39.419192 kubelet[2926]: E0119 09:26:39.419145 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:26:39.421987 kubelet[2926]: E0119 09:26:39.421938 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" Jan 19 09:26:39.426000 audit[4893]: NETFILTER_CFG table=filter:135 family=2 entries=56 op=nft_register_chain pid=4893 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 19 09:26:39.426000 audit[4893]: SYSCALL arch=c000003e 
syscall=46 success=yes exit=25500 a0=3 a1=7ffe58602df0 a2=0 a3=7ffe58602ddc items=0 ppid=4115 pid=4893 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.426000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 19 09:26:39.483283 kubelet[2926]: I0119 09:26:39.482568 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-bvxq9" podStartSLOduration=42.482538469 podStartE2EDuration="42.482538469s" podCreationTimestamp="2026-01-19 09:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-19 09:26:39.443898619 +0000 UTC m=+49.340717380" watchObservedRunningTime="2026-01-19 09:26:39.482538469 +0000 UTC m=+49.379357220" Jan 19 09:26:39.492077 containerd[1676]: time="2026-01-19T09:26:39.491976029Z" level=info msg="connecting to shim 25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125" address="unix:///run/containerd/s/aeefc89d0098315fd95e04397e123ccdead14796688aa176a44c708cb8d1dcf0" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:26:39.546080 systemd-networkd[1554]: cali6b8a1004ca9: Gained IPv6LL Jan 19 09:26:39.551923 systemd[1]: Started cri-containerd-25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125.scope - libcontainer container 25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125. Jan 19 09:26:39.560970 systemd-networkd[1554]: calif6638687f41: Link UP Jan 19 09:26:39.563294 systemd-networkd[1554]: calif6638687f41: Gained carrier Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.332 [INFO][4856] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--fr6vw-eth0 coredns-66bc5c9577- kube-system c3af37da-419f-42f6-9360-491eb370d45d 858 0 2026-01-19 09:25:57 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4580-0-0-p-637e605ed7 coredns-66bc5c9577-fr6vw eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif6638687f41 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" Namespace="kube-system" Pod="coredns-66bc5c9577-fr6vw" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--fr6vw-" Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.333 [INFO][4856] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" Namespace="kube-system" Pod="coredns-66bc5c9577-fr6vw" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--fr6vw-eth0" Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.364 [INFO][4872] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" HandleID="k8s-pod-network.50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" 
Workload="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--fr6vw-eth0" Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.364 [INFO][4872] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" HandleID="k8s-pod-network.50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" Workload="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--fr6vw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d55a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4580-0-0-p-637e605ed7", "pod":"coredns-66bc5c9577-fr6vw", "timestamp":"2026-01-19 09:26:39.36421594 +0000 UTC"}, Hostname:"ci-4580-0-0-p-637e605ed7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.364 [INFO][4872] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.383 [INFO][4872] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.384 [INFO][4872] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-637e605ed7' Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.465 [INFO][4872] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.478 [INFO][4872] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.501 [INFO][4872] ipam/ipam.go 511: Trying affinity for 192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.509 [INFO][4872] ipam/ipam.go 158: Attempting to load block cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.517 [INFO][4872] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.30.0/26 host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.518 [INFO][4872] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.30.0/26 handle="k8s-pod-network.50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.522 [INFO][4872] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0 Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.529 [INFO][4872] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.30.0/26 handle="k8s-pod-network.50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.538 [INFO][4872] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.30.9/26] block=192.168.30.0/26 handle="k8s-pod-network.50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.539 [INFO][4872] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.30.9/26] 
handle="k8s-pod-network.50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" host="ci-4580-0-0-p-637e605ed7" Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.539 [INFO][4872] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 19 09:26:39.620549 containerd[1676]: 2026-01-19 09:26:39.539 [INFO][4872] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.30.9/26] IPv6=[] ContainerID="50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" HandleID="k8s-pod-network.50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" Workload="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--fr6vw-eth0" Jan 19 09:26:39.621210 containerd[1676]: 2026-01-19 09:26:39.553 [INFO][4856] cni-plugin/k8s.go 418: Populated endpoint ContainerID="50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" Namespace="kube-system" Pod="coredns-66bc5c9577-fr6vw" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--fr6vw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--fr6vw-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"c3af37da-419f-42f6-9360-491eb370d45d", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 25, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"", Pod:"coredns-66bc5c9577-fr6vw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.30.9/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif6638687f41", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:39.621210 containerd[1676]: 2026-01-19 09:26:39.553 [INFO][4856] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.30.9/32] ContainerID="50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" Namespace="kube-system" Pod="coredns-66bc5c9577-fr6vw" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--fr6vw-eth0" Jan 19 09:26:39.621210 containerd[1676]: 2026-01-19 09:26:39.553 [INFO][4856] 
cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif6638687f41 ContainerID="50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" Namespace="kube-system" Pod="coredns-66bc5c9577-fr6vw" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--fr6vw-eth0" Jan 19 09:26:39.621210 containerd[1676]: 2026-01-19 09:26:39.569 [INFO][4856] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" Namespace="kube-system" Pod="coredns-66bc5c9577-fr6vw" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--fr6vw-eth0" Jan 19 09:26:39.621210 containerd[1676]: 2026-01-19 09:26:39.602 [INFO][4856] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" Namespace="kube-system" Pod="coredns-66bc5c9577-fr6vw" WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--fr6vw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--fr6vw-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"c3af37da-419f-42f6-9360-491eb370d45d", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2026, time.January, 19, 9, 25, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-637e605ed7", ContainerID:"50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0", Pod:"coredns-66bc5c9577-fr6vw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.30.9/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif6638687f41", MAC:"e2:51:8c:3d:4b:02", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 19 09:26:39.622649 containerd[1676]: 2026-01-19 09:26:39.618 [INFO][4856] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" Namespace="kube-system" Pod="coredns-66bc5c9577-fr6vw" 
WorkloadEndpoint="ci--4580--0--0--p--637e605ed7-k8s-coredns--66bc5c9577--fr6vw-eth0" Jan 19 09:26:39.624000 audit: BPF prog-id=246 op=LOAD Jan 19 09:26:39.625000 audit: BPF prog-id=247 op=LOAD Jan 19 09:26:39.625000 audit[4916]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4903 pid=4916 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.625000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235613939353464346235613262643537656239646634623933353933 Jan 19 09:26:39.626000 audit: BPF prog-id=247 op=UNLOAD Jan 19 09:26:39.626000 audit[4916]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4903 pid=4916 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235613939353464346235613262643537656239646634623933353933 Jan 19 09:26:39.626000 audit: BPF prog-id=248 op=LOAD Jan 19 09:26:39.626000 audit[4916]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4903 pid=4916 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235613939353464346235613262643537656239646634623933353933 Jan 19 09:26:39.626000 audit: BPF prog-id=249 op=LOAD Jan 19 09:26:39.626000 audit[4916]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4903 pid=4916 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235613939353464346235613262643537656239646634623933353933 Jan 19 09:26:39.626000 audit: BPF prog-id=249 op=UNLOAD Jan 19 09:26:39.626000 audit[4916]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4903 pid=4916 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235613939353464346235613262643537656239646634623933353933 Jan 19 
09:26:39.626000 audit: BPF prog-id=248 op=UNLOAD Jan 19 09:26:39.626000 audit[4916]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4903 pid=4916 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235613939353464346235613262643537656239646634623933353933 Jan 19 09:26:39.626000 audit: BPF prog-id=250 op=LOAD Jan 19 09:26:39.626000 audit[4916]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4903 pid=4916 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.626000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235613939353464346235613262643537656239646634623933353933 Jan 19 09:26:39.666000 audit[4946]: NETFILTER_CFG table=filter:136 family=2 entries=17 op=nft_register_rule pid=4946 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:39.666000 audit[4946]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffca3a19ea0 a2=0 a3=7ffca3a19e8c items=0 ppid=3032 pid=4946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.666000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:39.676000 audit[4946]: NETFILTER_CFG table=nat:137 family=2 entries=35 op=nft_register_chain pid=4946 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:39.676000 audit[4946]: SYSCALL arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7ffca3a19ea0 a2=0 a3=7ffca3a19e8c items=0 ppid=3032 pid=4946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.676000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:39.684776 containerd[1676]: time="2026-01-19T09:26:39.684659441Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-km4zx,Uid:cf97e1c1-8e02-43e7-988e-adabcb45ce04,Namespace:calico-system,Attempt:0,} returns sandbox id \"25a9954d4b5a2bd57eb9df4b9359397cda492380b380d998292896adcf88d125\"" Jan 19 09:26:39.693562 containerd[1676]: time="2026-01-19T09:26:39.693526538Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 19 09:26:39.697089 containerd[1676]: time="2026-01-19T09:26:39.697052749Z" level=info msg="connecting to shim 50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0" address="unix:///run/containerd/s/4fe639d04a23c6b0515cb819b67a930f7ffd89fe92bf75a17e44323b1e215d99" namespace=k8s.io protocol=ttrpc version=3 Jan 19 09:26:39.701000 audit[4962]: NETFILTER_CFG table=filter:138 
family=2 entries=56 op=nft_register_chain pid=4962 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 19 09:26:39.701000 audit[4962]: SYSCALL arch=c000003e syscall=46 success=yes exit=25080 a0=3 a1=7ffd526746a0 a2=0 a3=7ffd5267468c items=0 ppid=4115 pid=4962 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.701000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 19 09:26:39.744593 systemd[1]: Started cri-containerd-50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0.scope - libcontainer container 50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0. Jan 19 09:26:39.756000 audit: BPF prog-id=251 op=LOAD Jan 19 09:26:39.757000 audit: BPF prog-id=252 op=LOAD Jan 19 09:26:39.757000 audit[4974]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=4961 pid=4974 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.757000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3530633462353566373232346462326332343832386432306437656239 Jan 19 09:26:39.757000 audit: BPF prog-id=252 op=UNLOAD Jan 19 09:26:39.757000 audit[4974]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4961 pid=4974 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.757000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3530633462353566373232346462326332343832386432306437656239 Jan 19 09:26:39.757000 audit: BPF prog-id=253 op=LOAD Jan 19 09:26:39.757000 audit[4974]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=4961 pid=4974 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.757000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3530633462353566373232346462326332343832386432306437656239 Jan 19 09:26:39.757000 audit: BPF prog-id=254 op=LOAD Jan 19 09:26:39.757000 audit[4974]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=4961 pid=4974 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.757000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3530633462353566373232346462326332343832386432306437656239 Jan 19 09:26:39.757000 audit: BPF prog-id=254 op=UNLOAD Jan 19 09:26:39.757000 audit[4974]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4961 pid=4974 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.757000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3530633462353566373232346462326332343832386432306437656239 Jan 19 09:26:39.757000 audit: BPF prog-id=253 op=UNLOAD Jan 19 09:26:39.757000 audit[4974]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4961 pid=4974 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.757000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3530633462353566373232346462326332343832386432306437656239 Jan 19 09:26:39.757000 audit: BPF prog-id=255 op=LOAD Jan 19 09:26:39.757000 audit[4974]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=4961 pid=4974 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.757000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3530633462353566373232346462326332343832386432306437656239 Jan 19 09:26:39.794254 containerd[1676]: time="2026-01-19T09:26:39.794174133Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-fr6vw,Uid:c3af37da-419f-42f6-9360-491eb370d45d,Namespace:kube-system,Attempt:0,} returns sandbox id \"50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0\"" Jan 19 09:26:39.799250 containerd[1676]: time="2026-01-19T09:26:39.799214939Z" level=info msg="CreateContainer within sandbox \"50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 19 09:26:39.807767 containerd[1676]: time="2026-01-19T09:26:39.807734795Z" level=info msg="Container 7a7c51125caf431b6b78348c6ebce428929c25480d3ebbcaccb9805c693f2513: CDI devices from CRI Config.CDIDevices: []" Jan 19 09:26:39.813192 containerd[1676]: time="2026-01-19T09:26:39.813161852Z" level=info msg="CreateContainer within sandbox \"50c4b55f7224db2c24828d20d7eb973d32fda389794a6a20c7ac326258732db0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7a7c51125caf431b6b78348c6ebce428929c25480d3ebbcaccb9805c693f2513\"" Jan 19 09:26:39.814245 containerd[1676]: time="2026-01-19T09:26:39.814207296Z" level=info msg="StartContainer for 
\"7a7c51125caf431b6b78348c6ebce428929c25480d3ebbcaccb9805c693f2513\"" Jan 19 09:26:39.815055 containerd[1676]: time="2026-01-19T09:26:39.814997268Z" level=info msg="connecting to shim 7a7c51125caf431b6b78348c6ebce428929c25480d3ebbcaccb9805c693f2513" address="unix:///run/containerd/s/4fe639d04a23c6b0515cb819b67a930f7ffd89fe92bf75a17e44323b1e215d99" protocol=ttrpc version=3 Jan 19 09:26:39.849587 systemd[1]: Started cri-containerd-7a7c51125caf431b6b78348c6ebce428929c25480d3ebbcaccb9805c693f2513.scope - libcontainer container 7a7c51125caf431b6b78348c6ebce428929c25480d3ebbcaccb9805c693f2513. Jan 19 09:26:39.867000 audit: BPF prog-id=256 op=LOAD Jan 19 09:26:39.868000 audit: BPF prog-id=257 op=LOAD Jan 19 09:26:39.868000 audit[5002]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4961 pid=5002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.868000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761376335313132356361663433316236623738333438633665626365 Jan 19 09:26:39.868000 audit: BPF prog-id=257 op=UNLOAD Jan 19 09:26:39.868000 audit[5002]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4961 pid=5002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.868000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761376335313132356361663433316236623738333438633665626365 Jan 19 09:26:39.869000 audit: BPF prog-id=258 op=LOAD Jan 19 09:26:39.869000 audit[5002]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4961 pid=5002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.869000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761376335313132356361663433316236623738333438633665626365 Jan 19 09:26:39.869000 audit: BPF prog-id=259 op=LOAD Jan 19 09:26:39.869000 audit[5002]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4961 pid=5002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.869000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761376335313132356361663433316236623738333438633665626365 Jan 19 09:26:39.870000 audit: BPF prog-id=259 op=UNLOAD Jan 19 09:26:39.870000 audit[5002]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 
ppid=4961 pid=5002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.870000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761376335313132356361663433316236623738333438633665626365 Jan 19 09:26:39.870000 audit: BPF prog-id=258 op=UNLOAD Jan 19 09:26:39.870000 audit[5002]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4961 pid=5002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.870000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761376335313132356361663433316236623738333438633665626365 Jan 19 09:26:39.870000 audit: BPF prog-id=260 op=LOAD Jan 19 09:26:39.870000 audit[5002]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4961 pid=5002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:39.870000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761376335313132356361663433316236623738333438633665626365 Jan 19 09:26:39.893310 containerd[1676]: time="2026-01-19T09:26:39.893263632Z" level=info msg="StartContainer for \"7a7c51125caf431b6b78348c6ebce428929c25480d3ebbcaccb9805c693f2513\" returns successfully" Jan 19 09:26:40.118832 containerd[1676]: time="2026-01-19T09:26:40.118673660Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:40.121075 containerd[1676]: time="2026-01-19T09:26:40.121030407Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 19 09:26:40.121335 containerd[1676]: time="2026-01-19T09:26:40.121176847Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:40.121576 kubelet[2926]: E0119 09:26:40.121487 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 19 09:26:40.121576 kubelet[2926]: E0119 09:26:40.121557 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 19 09:26:40.121839 kubelet[2926]: E0119 09:26:40.121785 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi 
start failed in pod csi-node-driver-km4zx_calico-system(cf97e1c1-8e02-43e7-988e-adabcb45ce04): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:40.123302 containerd[1676]: time="2026-01-19T09:26:40.123281314Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 19 09:26:40.186082 systemd-networkd[1554]: calie182e811c04: Gained IPv6LL Jan 19 09:26:40.433790 kubelet[2926]: I0119 09:26:40.433300 2926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-fr6vw" podStartSLOduration=43.433286308 podStartE2EDuration="43.433286308s" podCreationTimestamp="2026-01-19 09:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-19 09:26:40.432786246 +0000 UTC m=+50.329604997" watchObservedRunningTime="2026-01-19 09:26:40.433286308 +0000 UTC m=+50.330105069" Jan 19 09:26:40.566312 containerd[1676]: time="2026-01-19T09:26:40.566248634Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:40.567602 containerd[1676]: time="2026-01-19T09:26:40.567540198Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 19 09:26:40.567749 containerd[1676]: time="2026-01-19T09:26:40.567617228Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:40.567909 kubelet[2926]: E0119 09:26:40.567843 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 19 09:26:40.567909 kubelet[2926]: E0119 09:26:40.567887 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 19 09:26:40.568329 kubelet[2926]: E0119 09:26:40.567985 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-km4zx_calico-system(cf97e1c1-8e02-43e7-988e-adabcb45ce04): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:40.568406 kubelet[2926]: E0119 09:26:40.568342 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for 
\"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:26:40.705000 audit[5039]: NETFILTER_CFG table=filter:139 family=2 entries=14 op=nft_register_rule pid=5039 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:40.705000 audit[5039]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffe9320a400 a2=0 a3=7ffe9320a3ec items=0 ppid=3032 pid=5039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:40.705000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:40.723000 audit[5039]: NETFILTER_CFG table=nat:140 family=2 entries=56 op=nft_register_chain pid=5039 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:26:40.723000 audit[5039]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7ffe9320a400 a2=0 a3=7ffe9320a3ec items=0 ppid=3032 pid=5039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:26:40.723000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:26:41.018328 systemd-networkd[1554]: calif6638687f41: Gained IPv6LL Jan 19 09:26:41.209112 systemd-networkd[1554]: cali28223f35f05: Gained IPv6LL Jan 19 09:26:41.431909 kubelet[2926]: E0119 09:26:41.431766 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:26:46.232017 containerd[1676]: time="2026-01-19T09:26:46.231901152Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 19 09:26:46.667885 containerd[1676]: time="2026-01-19T09:26:46.667807681Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:46.669524 containerd[1676]: time="2026-01-19T09:26:46.669340414Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 19 09:26:46.669524 containerd[1676]: time="2026-01-19T09:26:46.669415774Z" level=info 
msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:46.669941 kubelet[2926]: E0119 09:26:46.669870 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 19 09:26:46.669941 kubelet[2926]: E0119 09:26:46.669930 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 19 09:26:46.670576 kubelet[2926]: E0119 09:26:46.670025 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-5cb97b98cc-snlsk_calico-system(61f2280f-1a00-4605-a568-ed8af0fdcb06): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:46.671543 containerd[1676]: time="2026-01-19T09:26:46.671311359Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 19 09:26:47.104427 containerd[1676]: time="2026-01-19T09:26:47.104267632Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:47.106219 containerd[1676]: time="2026-01-19T09:26:47.106066025Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 19 09:26:47.106219 containerd[1676]: time="2026-01-19T09:26:47.106121446Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:47.106454 kubelet[2926]: E0119 09:26:47.106356 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 19 09:26:47.106454 kubelet[2926]: E0119 09:26:47.106409 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 19 09:26:47.106577 kubelet[2926]: E0119 09:26:47.106470 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-5cb97b98cc-snlsk_calico-system(61f2280f-1a00-4605-a568-ed8af0fdcb06): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:47.106577 kubelet[2926]: E0119 09:26:47.106502 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: 
\"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cb97b98cc-snlsk" podUID="61f2280f-1a00-4605-a568-ed8af0fdcb06" Jan 19 09:26:48.227413 containerd[1676]: time="2026-01-19T09:26:48.227278863Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 19 09:26:48.685009 containerd[1676]: time="2026-01-19T09:26:48.684940320Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:48.686361 containerd[1676]: time="2026-01-19T09:26:48.686300174Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 19 09:26:48.686451 containerd[1676]: time="2026-01-19T09:26:48.686413174Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:48.686661 kubelet[2926]: E0119 09:26:48.686612 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:26:48.687604 kubelet[2926]: E0119 09:26:48.686669 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:26:48.687604 kubelet[2926]: E0119 09:26:48.686776 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-558fc744bc-cf5rv_calico-apiserver(37c84e8a-a62d-48e6-91f5-791aabccbba8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:48.687604 kubelet[2926]: E0119 09:26:48.686823 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:26:50.229856 containerd[1676]: time="2026-01-19T09:26:50.229779032Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 19 09:26:50.707230 containerd[1676]: time="2026-01-19T09:26:50.707170677Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:50.712093 containerd[1676]: time="2026-01-19T09:26:50.711931397Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 19 09:26:50.712093 containerd[1676]: time="2026-01-19T09:26:50.712037787Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:50.712645 kubelet[2926]: E0119 09:26:50.712575 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 19 09:26:50.713273 kubelet[2926]: E0119 09:26:50.712647 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 19 09:26:50.713273 kubelet[2926]: E0119 09:26:50.713107 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-6bcfd67b5c-6xwgh_calico-system(c1023597-3f97-435e-a787-6c49b8480f62): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:50.713487 kubelet[2926]: E0119 09:26:50.713346 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:26:51.226702 containerd[1676]: time="2026-01-19T09:26:51.226626369Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 19 09:26:51.653425 containerd[1676]: time="2026-01-19T09:26:51.653232295Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:51.659937 containerd[1676]: time="2026-01-19T09:26:51.659826528Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 19 09:26:51.660367 containerd[1676]: time="2026-01-19T09:26:51.659955328Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:51.660579 kubelet[2926]: E0119 09:26:51.660236 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:26:51.660579 kubelet[2926]: E0119 09:26:51.660305 2926 kuberuntime_image.go:43] "Failed 
to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:26:51.662262 kubelet[2926]: E0119 09:26:51.660497 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-67d5ff6dcb-sg9sb_calico-apiserver(4ce7f23f-45be-4728-a485-c7ae56af8a5c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:51.662981 kubelet[2926]: E0119 09:26:51.662273 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:26:54.225790 containerd[1676]: time="2026-01-19T09:26:54.225682238Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 19 09:26:54.677038 containerd[1676]: time="2026-01-19T09:26:54.676977693Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:54.679264 containerd[1676]: time="2026-01-19T09:26:54.679144390Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 19 09:26:54.679486 containerd[1676]: time="2026-01-19T09:26:54.679158799Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:54.679674 kubelet[2926]: E0119 09:26:54.679636 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:26:54.680195 kubelet[2926]: E0119 09:26:54.680112 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:26:54.680475 kubelet[2926]: E0119 09:26:54.680220 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-67d5ff6dcb-bnwh7_calico-apiserver(911306cd-bd01-4287-9129-4bbfd31bbd16): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:54.680475 kubelet[2926]: E0119 09:26:54.680256 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" Jan 19 09:26:55.225878 containerd[1676]: time="2026-01-19T09:26:55.225812621Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 19 09:26:55.658962 containerd[1676]: time="2026-01-19T09:26:55.658896159Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:55.661640 containerd[1676]: time="2026-01-19T09:26:55.661581719Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 19 09:26:55.661931 containerd[1676]: time="2026-01-19T09:26:55.661724364Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:55.662171 kubelet[2926]: E0119 09:26:55.662101 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 19 09:26:55.662309 kubelet[2926]: E0119 09:26:55.662174 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 19 09:26:55.662309 kubelet[2926]: E0119 09:26:55.662268 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-l9tzs_calico-system(d4703ae1-6e1e-4185-acf2-7e68ee8729a3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:55.662459 kubelet[2926]: E0119 09:26:55.662310 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:26:56.227844 containerd[1676]: time="2026-01-19T09:26:56.227794649Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 19 09:26:56.665490 containerd[1676]: time="2026-01-19T09:26:56.665421457Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:56.666481 containerd[1676]: time="2026-01-19T09:26:56.666438480Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 19 09:26:56.667088 containerd[1676]: time="2026-01-19T09:26:56.666529937Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 19 
09:26:56.667145 kubelet[2926]: E0119 09:26:56.666741 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 19 09:26:56.667145 kubelet[2926]: E0119 09:26:56.666790 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 19 09:26:56.667145 kubelet[2926]: E0119 09:26:56.666879 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-km4zx_calico-system(cf97e1c1-8e02-43e7-988e-adabcb45ce04): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:56.668531 containerd[1676]: time="2026-01-19T09:26:56.668494275Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 19 09:26:57.125442 containerd[1676]: time="2026-01-19T09:26:57.125346862Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:26:57.127465 containerd[1676]: time="2026-01-19T09:26:57.127422149Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 19 09:26:57.127583 containerd[1676]: time="2026-01-19T09:26:57.127494976Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 19 09:26:57.127705 kubelet[2926]: E0119 09:26:57.127658 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 19 09:26:57.127798 kubelet[2926]: E0119 09:26:57.127763 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 19 09:26:57.127858 kubelet[2926]: E0119 09:26:57.127849 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-km4zx_calico-system(cf97e1c1-8e02-43e7-988e-adabcb45ce04): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 19 09:26:57.127928 kubelet[2926]: E0119 09:26:57.127882 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull 
and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:26:59.228365 kubelet[2926]: E0119 09:26:59.227860 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cb97b98cc-snlsk" podUID="61f2280f-1a00-4605-a568-ed8af0fdcb06" Jan 19 09:27:00.226415 kubelet[2926]: E0119 09:27:00.225732 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:27:04.229274 kubelet[2926]: E0119 09:27:04.228597 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:27:04.229274 kubelet[2926]: E0119 09:27:04.229188 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:27:06.227419 kubelet[2926]: E0119 09:27:06.226701 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:27:07.225017 kubelet[2926]: E0119 09:27:07.224946 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" Jan 19 09:27:10.235057 kubelet[2926]: E0119 09:27:10.233736 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:27:12.225959 containerd[1676]: time="2026-01-19T09:27:12.225917486Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 19 09:27:12.663326 containerd[1676]: time="2026-01-19T09:27:12.663115596Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:27:12.664365 containerd[1676]: time="2026-01-19T09:27:12.664280399Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 19 09:27:12.664365 containerd[1676]: time="2026-01-19T09:27:12.664337998Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 19 09:27:12.666460 kubelet[2926]: E0119 09:27:12.666418 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 19 09:27:12.666827 kubelet[2926]: E0119 09:27:12.666465 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 19 09:27:12.666827 kubelet[2926]: E0119 09:27:12.666529 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod 
whisker-5cb97b98cc-snlsk_calico-system(61f2280f-1a00-4605-a568-ed8af0fdcb06): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 19 09:27:12.667535 containerd[1676]: time="2026-01-19T09:27:12.667513544Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 19 09:27:13.130839 containerd[1676]: time="2026-01-19T09:27:13.130795016Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:27:13.137649 containerd[1676]: time="2026-01-19T09:27:13.137495955Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 19 09:27:13.137649 containerd[1676]: time="2026-01-19T09:27:13.137493325Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 19 09:27:13.138302 kubelet[2926]: E0119 09:27:13.138214 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 19 09:27:13.138984 kubelet[2926]: E0119 09:27:13.138744 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 19 09:27:13.138984 kubelet[2926]: E0119 09:27:13.138868 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-5cb97b98cc-snlsk_calico-system(61f2280f-1a00-4605-a568-ed8af0fdcb06): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 19 09:27:13.138984 kubelet[2926]: E0119 09:27:13.138933 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cb97b98cc-snlsk" podUID="61f2280f-1a00-4605-a568-ed8af0fdcb06" Jan 19 09:27:15.228672 containerd[1676]: time="2026-01-19T09:27:15.228569456Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 19 09:27:15.668583 containerd[1676]: time="2026-01-19T09:27:15.668224703Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:27:15.671798 containerd[1676]: time="2026-01-19T09:27:15.671747357Z" 
level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 19 09:27:15.672185 containerd[1676]: time="2026-01-19T09:27:15.671848045Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 19 09:27:15.672913 kubelet[2926]: E0119 09:27:15.672678 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:27:15.672913 kubelet[2926]: E0119 09:27:15.672755 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:27:15.672913 kubelet[2926]: E0119 09:27:15.672847 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-558fc744bc-cf5rv_calico-apiserver(37c84e8a-a62d-48e6-91f5-791aabccbba8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 19 09:27:15.674582 kubelet[2926]: E0119 09:27:15.673715 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:27:17.228181 containerd[1676]: time="2026-01-19T09:27:17.226270928Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 19 09:27:17.723082 containerd[1676]: time="2026-01-19T09:27:17.722838357Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:27:17.725421 containerd[1676]: time="2026-01-19T09:27:17.725282257Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 19 09:27:17.725421 containerd[1676]: time="2026-01-19T09:27:17.725329066Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 19 09:27:17.726695 kubelet[2926]: E0119 09:27:17.726620 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:27:17.726695 kubelet[2926]: E0119 09:27:17.726672 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:27:17.727822 kubelet[2926]: E0119 09:27:17.727333 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-67d5ff6dcb-sg9sb_calico-apiserver(4ce7f23f-45be-4728-a485-c7ae56af8a5c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 19 09:27:17.727822 kubelet[2926]: E0119 09:27:17.727383 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:27:18.227621 containerd[1676]: time="2026-01-19T09:27:18.227498129Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 19 09:27:18.649586 containerd[1676]: time="2026-01-19T09:27:18.649493292Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:27:18.650992 containerd[1676]: time="2026-01-19T09:27:18.650916013Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 19 09:27:18.651159 containerd[1676]: time="2026-01-19T09:27:18.651041301Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 19 09:27:18.651495 kubelet[2926]: E0119 09:27:18.651272 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 19 09:27:18.651495 kubelet[2926]: E0119 09:27:18.651360 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 19 09:27:18.651691 kubelet[2926]: E0119 09:27:18.651629 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-6bcfd67b5c-6xwgh_calico-system(c1023597-3f97-435e-a787-6c49b8480f62): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 19 09:27:18.651760 kubelet[2926]: E0119 09:27:18.651705 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:27:18.652500 containerd[1676]: time="2026-01-19T09:27:18.652247097Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 19 09:27:19.106413 containerd[1676]: time="2026-01-19T09:27:19.106342814Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:27:19.109909 containerd[1676]: time="2026-01-19T09:27:19.109861626Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 19 09:27:19.110008 containerd[1676]: time="2026-01-19T09:27:19.109952414Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 19 09:27:19.110195 kubelet[2926]: E0119 09:27:19.110158 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 19 09:27:19.110618 kubelet[2926]: E0119 09:27:19.110227 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 19 09:27:19.110618 kubelet[2926]: E0119 09:27:19.110291 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-l9tzs_calico-system(d4703ae1-6e1e-4185-acf2-7e68ee8729a3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 19 09:27:19.110618 kubelet[2926]: E0119 09:27:19.110320 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:27:21.225548 containerd[1676]: time="2026-01-19T09:27:21.225507912Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 19 09:27:21.659947 containerd[1676]: time="2026-01-19T09:27:21.659822510Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:27:21.661481 containerd[1676]: time="2026-01-19T09:27:21.661414812Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 19 09:27:21.661625 containerd[1676]: time="2026-01-19T09:27:21.661534759Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 19 09:27:21.661934 kubelet[2926]: E0119 09:27:21.661865 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:27:21.663170 kubelet[2926]: E0119 09:27:21.661943 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:27:21.663170 kubelet[2926]: E0119 09:27:21.662137 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-67d5ff6dcb-bnwh7_calico-apiserver(911306cd-bd01-4287-9129-4bbfd31bbd16): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 19 09:27:21.663170 kubelet[2926]: E0119 09:27:21.662637 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" Jan 19 09:27:24.232535 kubelet[2926]: E0119 09:27:24.231860 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cb97b98cc-snlsk" podUID="61f2280f-1a00-4605-a568-ed8af0fdcb06" Jan 19 09:27:25.225684 containerd[1676]: time="2026-01-19T09:27:25.225635336Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 19 09:27:25.654732 containerd[1676]: time="2026-01-19T09:27:25.654510110Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:27:25.656224 containerd[1676]: time="2026-01-19T09:27:25.656190492Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 19 09:27:25.656423 containerd[1676]: time="2026-01-19T09:27:25.656329350Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 19 09:27:25.656870 
kubelet[2926]: E0119 09:27:25.656709 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 19 09:27:25.656870 kubelet[2926]: E0119 09:27:25.656750 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 19 09:27:25.658198 kubelet[2926]: E0119 09:27:25.657288 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-km4zx_calico-system(cf97e1c1-8e02-43e7-988e-adabcb45ce04): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 19 09:27:25.659098 containerd[1676]: time="2026-01-19T09:27:25.659070314Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 19 09:27:26.096322 containerd[1676]: time="2026-01-19T09:27:26.096259448Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:27:26.098363 containerd[1676]: time="2026-01-19T09:27:26.098320545Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 19 09:27:26.098528 containerd[1676]: time="2026-01-19T09:27:26.098348134Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 19 09:27:26.098720 kubelet[2926]: E0119 09:27:26.098643 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 19 09:27:26.098720 kubelet[2926]: E0119 09:27:26.098685 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 19 09:27:26.098892 kubelet[2926]: E0119 09:27:26.098832 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-km4zx_calico-system(cf97e1c1-8e02-43e7-988e-adabcb45ce04): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 19 09:27:26.098892 kubelet[2926]: E0119 09:27:26.098882 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:27:27.226135 kubelet[2926]: E0119 09:27:27.225936 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:27:29.226798 kubelet[2926]: E0119 09:27:29.226595 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:27:31.225871 kubelet[2926]: E0119 09:27:31.225789 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:27:32.229644 kubelet[2926]: E0119 09:27:32.229600 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:27:34.225919 kubelet[2926]: E0119 09:27:34.225803 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" Jan 19 09:27:37.960202 systemd[1]: Started sshd@8-77.42.19.180:22-4.153.228.146:42620.service - OpenSSH 
per-connection server daemon (4.153.228.146:42620). Jan 19 09:27:37.975861 kernel: kauditd_printk_skb: 183 callbacks suppressed Jan 19 09:27:37.976048 kernel: audit: type=1130 audit(1768814857.958:760): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-77.42.19.180:22-4.153.228.146:42620 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:27:37.958000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-77.42.19.180:22-4.153.228.146:42620 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:27:38.252494 kubelet[2926]: E0119 09:27:38.251482 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cb97b98cc-snlsk" podUID="61f2280f-1a00-4605-a568-ed8af0fdcb06" Jan 19 09:27:38.667114 sshd[5135]: Accepted publickey for core from 4.153.228.146 port 42620 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:27:38.674613 kernel: audit: type=1101 audit(1768814858.665:761): pid=5135 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:38.665000 audit[5135]: USER_ACCT pid=5135 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:38.676422 sshd-session[5135]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:27:38.673000 audit[5135]: CRED_ACQ pid=5135 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:38.685456 kernel: audit: type=1103 audit(1768814858.673:762): pid=5135 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:38.691425 kernel: audit: type=1006 audit(1768814858.673:763): pid=5135 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Jan 19 09:27:38.673000 audit[5135]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcfc634ab0 a2=3 a3=0 items=0 ppid=1 pid=5135 auid=500 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:27:38.695920 systemd-logind[1647]: New session 9 of user core. Jan 19 09:27:38.698420 kernel: audit: type=1300 audit(1768814858.673:763): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcfc634ab0 a2=3 a3=0 items=0 ppid=1 pid=5135 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:27:38.673000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:27:38.703424 kernel: audit: type=1327 audit(1768814858.673:763): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:27:38.705080 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 19 09:27:38.720458 kernel: audit: type=1105 audit(1768814858.712:764): pid=5135 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:38.712000 audit[5135]: USER_START pid=5135 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:38.714000 audit[5139]: CRED_ACQ pid=5139 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:38.728419 kernel: audit: type=1103 audit(1768814858.714:765): pid=5139 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:39.156368 sshd[5139]: Connection closed by 4.153.228.146 port 42620 Jan 19 09:27:39.158113 sshd-session[5135]: pam_unix(sshd:session): session closed for user core Jan 19 09:27:39.159000 audit[5135]: USER_END pid=5135 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:39.163711 systemd[1]: sshd@8-77.42.19.180:22-4.153.228.146:42620.service: Deactivated successfully. Jan 19 09:27:39.169555 kernel: audit: type=1106 audit(1768814859.159:766): pid=5135 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:39.167499 systemd[1]: session-9.scope: Deactivated successfully. Jan 19 09:27:39.172019 systemd-logind[1647]: Session 9 logged out. Waiting for processes to exit. 
Jan 19 09:27:39.159000 audit[5135]: CRED_DISP pid=5135 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:39.174172 systemd-logind[1647]: Removed session 9. Jan 19 09:27:39.180440 kernel: audit: type=1104 audit(1768814859.159:767): pid=5135 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:39.162000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-77.42.19.180:22-4.153.228.146:42620 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:27:41.226106 kubelet[2926]: E0119 09:27:41.225978 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:27:41.226847 kubelet[2926]: E0119 09:27:41.226583 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:27:41.229629 kubelet[2926]: E0119 09:27:41.228386 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:27:43.226941 kubelet[2926]: E0119 09:27:43.226636 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" 
pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:27:44.299690 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 19 09:27:44.299877 kernel: audit: type=1130 audit(1768814864.290:769): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-77.42.19.180:22-4.153.228.146:42626 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:27:44.290000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-77.42.19.180:22-4.153.228.146:42626 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:27:44.292000 systemd[1]: Started sshd@9-77.42.19.180:22-4.153.228.146:42626.service - OpenSSH per-connection server daemon (4.153.228.146:42626). Jan 19 09:27:44.948000 audit[5155]: USER_ACCT pid=5155 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:44.954388 sshd-session[5155]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:27:44.956147 sshd[5155]: Accepted publickey for core from 4.153.228.146 port 42626 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:27:44.957881 kernel: audit: type=1101 audit(1768814864.948:770): pid=5155 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:44.950000 audit[5155]: CRED_ACQ pid=5155 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:44.970414 kernel: audit: type=1103 audit(1768814864.950:771): pid=5155 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:44.975278 systemd-logind[1647]: New session 10 of user core. Jan 19 09:27:44.984430 kernel: audit: type=1006 audit(1768814864.950:772): pid=5155 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Jan 19 09:27:44.985775 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 19 09:27:44.950000 audit[5155]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe420af3c0 a2=3 a3=0 items=0 ppid=1 pid=5155 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:27:44.997450 kernel: audit: type=1300 audit(1768814864.950:772): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe420af3c0 a2=3 a3=0 items=0 ppid=1 pid=5155 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:27:44.950000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:27:45.005440 kernel: audit: type=1327 audit(1768814864.950:772): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:27:44.998000 audit[5155]: USER_START pid=5155 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:45.004000 audit[5159]: CRED_ACQ pid=5159 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:45.019163 kernel: audit: type=1105 audit(1768814864.998:773): pid=5155 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:45.019249 kernel: audit: type=1103 audit(1768814865.004:774): pid=5159 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:45.226168 kubelet[2926]: E0119 09:27:45.225735 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:27:45.410418 sshd[5159]: Connection closed by 4.153.228.146 port 42626 Jan 19 09:27:45.408930 sshd-session[5155]: pam_unix(sshd:session): session closed for user core Jan 19 09:27:45.408000 audit[5155]: USER_END pid=5155 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:45.417018 systemd[1]: sshd@9-77.42.19.180:22-4.153.228.146:42626.service: 
Deactivated successfully. Jan 19 09:27:45.419696 kernel: audit: type=1106 audit(1768814865.408:775): pid=5155 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:45.421091 systemd[1]: session-10.scope: Deactivated successfully. Jan 19 09:27:45.409000 audit[5155]: CRED_DISP pid=5155 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:45.425606 systemd-logind[1647]: Session 10 logged out. Waiting for processes to exit. Jan 19 09:27:45.427733 systemd-logind[1647]: Removed session 10. Jan 19 09:27:45.429130 kernel: audit: type=1104 audit(1768814865.409:776): pid=5155 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:45.412000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-77.42.19.180:22-4.153.228.146:42626 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:27:48.227570 kubelet[2926]: E0119 09:27:48.227506 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" Jan 19 09:27:50.546000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-77.42.19.180:22-4.153.228.146:53692 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:27:50.549555 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 19 09:27:50.549617 kernel: audit: type=1130 audit(1768814870.546:778): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-77.42.19.180:22-4.153.228.146:53692 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:27:50.547853 systemd[1]: Started sshd@10-77.42.19.180:22-4.153.228.146:53692.service - OpenSSH per-connection server daemon (4.153.228.146:53692). 
Jan 19 09:27:51.228000 audit[5175]: USER_ACCT pid=5175 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:51.232315 sshd[5175]: Accepted publickey for core from 4.153.228.146 port 53692 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:27:51.235530 sshd-session[5175]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:27:51.231000 audit[5175]: CRED_ACQ pid=5175 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:51.238986 kernel: audit: type=1101 audit(1768814871.228:779): pid=5175 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:51.239095 kernel: audit: type=1103 audit(1768814871.231:780): pid=5175 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:51.255993 kernel: audit: type=1006 audit(1768814871.231:781): pid=5175 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 19 09:27:51.256057 kernel: audit: type=1300 audit(1768814871.231:781): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff432e9f00 a2=3 a3=0 items=0 ppid=1 pid=5175 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:27:51.231000 audit[5175]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff432e9f00 a2=3 a3=0 items=0 ppid=1 pid=5175 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:27:51.231000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:27:51.260423 kernel: audit: type=1327 audit(1768814871.231:781): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:27:51.262682 systemd-logind[1647]: New session 11 of user core. Jan 19 09:27:51.266553 systemd[1]: Started session-11.scope - Session 11 of User core. 
Jan 19 09:27:51.274000 audit[5175]: USER_START pid=5175 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:51.283438 kernel: audit: type=1105 audit(1768814871.274:782): pid=5175 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:51.282000 audit[5179]: CRED_ACQ pid=5179 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:51.290523 kernel: audit: type=1103 audit(1768814871.282:783): pid=5179 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:51.660373 sshd[5179]: Connection closed by 4.153.228.146 port 53692 Jan 19 09:27:51.662511 sshd-session[5175]: pam_unix(sshd:session): session closed for user core Jan 19 09:27:51.663000 audit[5175]: USER_END pid=5175 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:51.669484 systemd[1]: sshd@10-77.42.19.180:22-4.153.228.146:53692.service: Deactivated successfully. Jan 19 09:27:51.673459 kernel: audit: type=1106 audit(1768814871.663:784): pid=5175 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:51.672604 systemd[1]: session-11.scope: Deactivated successfully. Jan 19 09:27:51.664000 audit[5175]: CRED_DISP pid=5175 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:51.679675 systemd-logind[1647]: Session 11 logged out. Waiting for processes to exit. Jan 19 09:27:51.668000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-77.42.19.180:22-4.153.228.146:53692 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:27:51.680464 kernel: audit: type=1104 audit(1768814871.664:785): pid=5175 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:51.680639 systemd-logind[1647]: Removed session 11. 
Jan 19 09:27:51.796959 systemd[1]: Started sshd@11-77.42.19.180:22-4.153.228.146:53700.service - OpenSSH per-connection server daemon (4.153.228.146:53700). Jan 19 09:27:51.795000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-77.42.19.180:22-4.153.228.146:53700 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:27:52.227669 kubelet[2926]: E0119 09:27:52.227181 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cb97b98cc-snlsk" podUID="61f2280f-1a00-4605-a568-ed8af0fdcb06" Jan 19 09:27:52.461000 audit[5192]: USER_ACCT pid=5192 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:52.464882 sshd[5192]: Accepted publickey for core from 4.153.228.146 port 53700 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:27:52.464000 audit[5192]: CRED_ACQ pid=5192 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:52.464000 audit[5192]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe80a24d80 a2=3 a3=0 items=0 ppid=1 pid=5192 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:27:52.464000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:27:52.468638 sshd-session[5192]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:27:52.478930 systemd-logind[1647]: New session 12 of user core. Jan 19 09:27:52.484765 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jan 19 09:27:52.488000 audit[5192]: USER_START pid=5192 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:52.491000 audit[5200]: CRED_ACQ pid=5200 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:52.921240 sshd[5200]: Connection closed by 4.153.228.146 port 53700 Jan 19 09:27:52.922366 sshd-session[5192]: pam_unix(sshd:session): session closed for user core Jan 19 09:27:52.923000 audit[5192]: USER_END pid=5192 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:52.923000 audit[5192]: CRED_DISP pid=5192 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:52.927555 systemd[1]: sshd@11-77.42.19.180:22-4.153.228.146:53700.service: Deactivated successfully. Jan 19 09:27:52.926000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-77.42.19.180:22-4.153.228.146:53700 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:27:52.931591 systemd[1]: session-12.scope: Deactivated successfully. Jan 19 09:27:52.933947 systemd-logind[1647]: Session 12 logged out. Waiting for processes to exit. Jan 19 09:27:52.936715 systemd-logind[1647]: Removed session 12. Jan 19 09:27:53.054000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-77.42.19.180:22-4.153.228.146:53708 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:27:53.055474 systemd[1]: Started sshd@12-77.42.19.180:22-4.153.228.146:53708.service - OpenSSH per-connection server daemon (4.153.228.146:53708). 
Jan 19 09:27:53.709000 audit[5210]: USER_ACCT pid=5210 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:53.710998 sshd[5210]: Accepted publickey for core from 4.153.228.146 port 53708 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:27:53.711000 audit[5210]: CRED_ACQ pid=5210 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:53.711000 audit[5210]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff346866b0 a2=3 a3=0 items=0 ppid=1 pid=5210 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:27:53.711000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:27:53.714893 sshd-session[5210]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:27:53.722980 systemd-logind[1647]: New session 13 of user core. Jan 19 09:27:53.728614 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 19 09:27:53.732000 audit[5210]: USER_START pid=5210 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:53.735000 audit[5214]: CRED_ACQ pid=5214 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:54.219861 sshd[5214]: Connection closed by 4.153.228.146 port 53708 Jan 19 09:27:54.220577 sshd-session[5210]: pam_unix(sshd:session): session closed for user core Jan 19 09:27:54.221000 audit[5210]: USER_END pid=5210 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:54.221000 audit[5210]: CRED_DISP pid=5210 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:27:54.224620 systemd-logind[1647]: Session 13 logged out. Waiting for processes to exit. Jan 19 09:27:54.227850 systemd[1]: sshd@12-77.42.19.180:22-4.153.228.146:53708.service: Deactivated successfully. Jan 19 09:27:54.227000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-77.42.19.180:22-4.153.228.146:53708 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:27:54.231185 systemd[1]: session-13.scope: Deactivated successfully. 
Jan 19 09:27:54.231758 kubelet[2926]: E0119 09:27:54.231708 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:27:54.233356 kubelet[2926]: E0119 09:27:54.232967 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:27:54.234961 systemd-logind[1647]: Removed session 13. Jan 19 09:27:55.226650 kubelet[2926]: E0119 09:27:55.226579 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:27:55.227793 kubelet[2926]: E0119 09:27:55.227446 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:27:58.224741 kubelet[2926]: E0119 09:27:58.224644 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:27:59.362564 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 19 09:27:59.362661 kernel: audit: type=1130 audit(1768814879.353:805): pid=1 uid=0 auid=4294967295 
ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-77.42.19.180:22-4.153.228.146:37954 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:27:59.353000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-77.42.19.180:22-4.153.228.146:37954 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:27:59.354619 systemd[1]: Started sshd@13-77.42.19.180:22-4.153.228.146:37954.service - OpenSSH per-connection server daemon (4.153.228.146:37954). Jan 19 09:28:00.019000 audit[5234]: USER_ACCT pid=5234 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:00.026867 sshd-session[5234]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:28:00.031225 sshd[5234]: Accepted publickey for core from 4.153.228.146 port 37954 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:28:00.033830 kernel: audit: type=1101 audit(1768814880.019:806): pid=5234 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:00.019000 audit[5234]: CRED_ACQ pid=5234 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:00.042861 systemd-logind[1647]: New session 14 of user core. Jan 19 09:28:00.049016 kernel: audit: type=1103 audit(1768814880.019:807): pid=5234 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:00.058637 kernel: audit: type=1006 audit(1768814880.019:808): pid=5234 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Jan 19 09:28:00.019000 audit[5234]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff698c3d90 a2=3 a3=0 items=0 ppid=1 pid=5234 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:00.059793 systemd[1]: Started session-14.scope - Session 14 of User core. 
Jan 19 09:28:00.019000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:28:00.074508 kernel: audit: type=1300 audit(1768814880.019:808): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff698c3d90 a2=3 a3=0 items=0 ppid=1 pid=5234 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:00.074621 kernel: audit: type=1327 audit(1768814880.019:808): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:28:00.072000 audit[5234]: USER_START pid=5234 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:00.081298 kernel: audit: type=1105 audit(1768814880.072:809): pid=5234 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:00.091541 kernel: audit: type=1103 audit(1768814880.079:810): pid=5240 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:00.079000 audit[5240]: CRED_ACQ pid=5240 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:00.507441 sshd[5240]: Connection closed by 4.153.228.146 port 37954 Jan 19 09:28:00.506128 sshd-session[5234]: pam_unix(sshd:session): session closed for user core Jan 19 09:28:00.510000 audit[5234]: USER_END pid=5234 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:00.533537 kernel: audit: type=1106 audit(1768814880.510:811): pid=5234 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:00.548625 kernel: audit: type=1104 audit(1768814880.510:812): pid=5234 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:00.510000 audit[5234]: CRED_DISP pid=5234 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:00.549167 systemd[1]: 
sshd@13-77.42.19.180:22-4.153.228.146:37954.service: Deactivated successfully. Jan 19 09:28:00.552735 systemd[1]: session-14.scope: Deactivated successfully. Jan 19 09:28:00.556459 systemd-logind[1647]: Session 14 logged out. Waiting for processes to exit. Jan 19 09:28:00.548000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-77.42.19.180:22-4.153.228.146:37954 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:28:00.560154 systemd-logind[1647]: Removed session 14. Jan 19 09:28:00.642493 systemd[1]: Started sshd@14-77.42.19.180:22-4.153.228.146:37960.service - OpenSSH per-connection server daemon (4.153.228.146:37960). Jan 19 09:28:00.641000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-77.42.19.180:22-4.153.228.146:37960 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:28:01.335000 audit[5252]: USER_ACCT pid=5252 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:01.337422 sshd[5252]: Accepted publickey for core from 4.153.228.146 port 37960 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:28:01.337000 audit[5252]: CRED_ACQ pid=5252 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:01.337000 audit[5252]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffe2af0ae0 a2=3 a3=0 items=0 ppid=1 pid=5252 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:01.337000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:28:01.340147 sshd-session[5252]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:28:01.348128 systemd-logind[1647]: New session 15 of user core. Jan 19 09:28:01.353597 systemd[1]: Started session-15.scope - Session 15 of User core. 
Jan 19 09:28:01.358000 audit[5252]: USER_START pid=5252 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:01.361000 audit[5256]: CRED_ACQ pid=5256 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:01.982342 sshd[5256]: Connection closed by 4.153.228.146 port 37960 Jan 19 09:28:01.983289 sshd-session[5252]: pam_unix(sshd:session): session closed for user core Jan 19 09:28:01.984000 audit[5252]: USER_END pid=5252 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:01.985000 audit[5252]: CRED_DISP pid=5252 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:01.990054 systemd-logind[1647]: Session 15 logged out. Waiting for processes to exit. Jan 19 09:28:01.991091 systemd[1]: sshd@14-77.42.19.180:22-4.153.228.146:37960.service: Deactivated successfully. Jan 19 09:28:01.990000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-77.42.19.180:22-4.153.228.146:37960 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:28:01.994312 systemd[1]: session-15.scope: Deactivated successfully. Jan 19 09:28:01.997215 systemd-logind[1647]: Removed session 15. Jan 19 09:28:02.121855 systemd[1]: Started sshd@15-77.42.19.180:22-4.153.228.146:37966.service - OpenSSH per-connection server daemon (4.153.228.146:37966). Jan 19 09:28:02.120000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-77.42.19.180:22-4.153.228.146:37966 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 19 09:28:02.251722 containerd[1676]: time="2026-01-19T09:28:02.251402727Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 19 09:28:02.708923 containerd[1676]: time="2026-01-19T09:28:02.708615099Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:28:02.710798 containerd[1676]: time="2026-01-19T09:28:02.710599524Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 19 09:28:02.710798 containerd[1676]: time="2026-01-19T09:28:02.710744313Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 19 09:28:02.712167 kubelet[2926]: E0119 09:28:02.711160 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:28:02.712167 kubelet[2926]: E0119 09:28:02.711230 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:28:02.712167 kubelet[2926]: E0119 09:28:02.711354 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-67d5ff6dcb-bnwh7_calico-apiserver(911306cd-bd01-4287-9129-4bbfd31bbd16): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 19 09:28:02.712167 kubelet[2926]: E0119 09:28:02.711472 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" Jan 19 09:28:02.811000 audit[5266]: USER_ACCT pid=5266 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:02.812947 sshd[5266]: Accepted publickey for core from 4.153.228.146 port 37966 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:28:02.812000 audit[5266]: CRED_ACQ pid=5266 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:02.812000 audit[5266]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe2cf30c30 a2=3 a3=0 items=0 ppid=1 pid=5266 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:02.812000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:28:02.815768 sshd-session[5266]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:28:02.823339 systemd-logind[1647]: New session 16 of user core. Jan 19 09:28:02.828919 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 19 09:28:02.832000 audit[5266]: USER_START pid=5266 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:02.834000 audit[5270]: CRED_ACQ pid=5270 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:03.601000 audit[5305]: NETFILTER_CFG table=filter:141 family=2 entries=26 op=nft_register_rule pid=5305 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:28:03.601000 audit[5305]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffde1b9d760 a2=0 a3=7ffde1b9d74c items=0 ppid=3032 pid=5305 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:03.601000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:28:03.607000 audit[5305]: NETFILTER_CFG table=nat:142 family=2 entries=20 op=nft_register_rule pid=5305 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:28:03.607000 audit[5305]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffde1b9d760 a2=0 a3=0 items=0 ppid=3032 pid=5305 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:03.607000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:28:03.728598 sshd[5270]: Connection closed by 4.153.228.146 port 37966 Jan 19 09:28:03.729040 sshd-session[5266]: pam_unix(sshd:session): session closed for user core Jan 19 09:28:03.730000 audit[5266]: USER_END pid=5266 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:03.730000 audit[5266]: CRED_DISP pid=5266 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:03.734480 systemd[1]: sshd@15-77.42.19.180:22-4.153.228.146:37966.service: Deactivated successfully. 
Jan 19 09:28:03.733000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-77.42.19.180:22-4.153.228.146:37966 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:28:03.738111 systemd[1]: session-16.scope: Deactivated successfully. Jan 19 09:28:03.740506 systemd-logind[1647]: Session 16 logged out. Waiting for processes to exit. Jan 19 09:28:03.741769 systemd-logind[1647]: Removed session 16. Jan 19 09:28:03.867427 systemd[1]: Started sshd@16-77.42.19.180:22-4.153.228.146:37968.service - OpenSSH per-connection server daemon (4.153.228.146:37968). Jan 19 09:28:03.866000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-77.42.19.180:22-4.153.228.146:37968 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:28:04.535450 kernel: kauditd_printk_skb: 30 callbacks suppressed Jan 19 09:28:04.535562 kernel: audit: type=1101 audit(1768814884.526:835): pid=5310 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:04.526000 audit[5310]: USER_ACCT pid=5310 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:04.531854 sshd-session[5310]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:28:04.535919 sshd[5310]: Accepted publickey for core from 4.153.228.146 port 37968 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:28:04.546559 kernel: audit: type=1103 audit(1768814884.529:836): pid=5310 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:04.529000 audit[5310]: CRED_ACQ pid=5310 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:04.542893 systemd-logind[1647]: New session 17 of user core. 
Jan 19 09:28:04.558317 kernel: audit: type=1006 audit(1768814884.529:837): pid=5310 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Jan 19 09:28:04.558440 kernel: audit: type=1300 audit(1768814884.529:837): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc5c3cc6b0 a2=3 a3=0 items=0 ppid=1 pid=5310 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:04.529000 audit[5310]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc5c3cc6b0 a2=3 a3=0 items=0 ppid=1 pid=5310 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:04.554757 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 19 09:28:04.529000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:28:04.561465 kernel: audit: type=1327 audit(1768814884.529:837): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:28:04.560000 audit[5310]: USER_START pid=5310 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:04.569446 kernel: audit: type=1105 audit(1768814884.560:838): pid=5310 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:04.568000 audit[5314]: CRED_ACQ pid=5314 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:04.575461 kernel: audit: type=1103 audit(1768814884.568:839): pid=5314 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:04.627000 audit[5316]: NETFILTER_CFG table=filter:143 family=2 entries=38 op=nft_register_rule pid=5316 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:28:04.633413 kernel: audit: type=1325 audit(1768814884.627:840): table=filter:143 family=2 entries=38 op=nft_register_rule pid=5316 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:28:04.627000 audit[5316]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7fff95e16140 a2=0 a3=7fff95e1612c items=0 ppid=3032 pid=5316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:04.640579 kernel: audit: type=1300 audit(1768814884.627:840): arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7fff95e16140 a2=0 a3=7fff95e1612c items=0 ppid=3032 pid=5316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:04.627000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:28:04.643467 kernel: audit: type=1327 audit(1768814884.627:840): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:28:04.640000 audit[5316]: NETFILTER_CFG table=nat:144 family=2 entries=20 op=nft_register_rule pid=5316 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:28:04.640000 audit[5316]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7fff95e16140 a2=0 a3=0 items=0 ppid=3032 pid=5316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:04.640000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:28:05.145435 sshd[5314]: Connection closed by 4.153.228.146 port 37968 Jan 19 09:28:05.144486 sshd-session[5310]: pam_unix(sshd:session): session closed for user core Jan 19 09:28:05.144000 audit[5310]: USER_END pid=5310 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:05.144000 audit[5310]: CRED_DISP pid=5310 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:05.152033 systemd[1]: sshd@16-77.42.19.180:22-4.153.228.146:37968.service: Deactivated successfully. Jan 19 09:28:05.151000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-77.42.19.180:22-4.153.228.146:37968 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:28:05.155777 systemd-logind[1647]: Session 17 logged out. Waiting for processes to exit. Jan 19 09:28:05.159033 systemd[1]: session-17.scope: Deactivated successfully. Jan 19 09:28:05.165262 systemd-logind[1647]: Removed session 17. Jan 19 09:28:05.225582 containerd[1676]: time="2026-01-19T09:28:05.225541018Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 19 09:28:05.285723 systemd[1]: Started sshd@17-77.42.19.180:22-4.153.228.146:39872.service - OpenSSH per-connection server daemon (4.153.228.146:39872). Jan 19 09:28:05.284000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-77.42.19.180:22-4.153.228.146:39872 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 19 09:28:05.680199 containerd[1676]: time="2026-01-19T09:28:05.680139805Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:28:05.681758 containerd[1676]: time="2026-01-19T09:28:05.681710504Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 19 09:28:05.681839 containerd[1676]: time="2026-01-19T09:28:05.681820083Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 19 09:28:05.682041 kubelet[2926]: E0119 09:28:05.681995 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 19 09:28:05.682354 kubelet[2926]: E0119 09:28:05.682057 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 19 09:28:05.682354 kubelet[2926]: E0119 09:28:05.682145 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-5cb97b98cc-snlsk_calico-system(61f2280f-1a00-4605-a568-ed8af0fdcb06): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 19 09:28:05.684119 containerd[1676]: time="2026-01-19T09:28:05.683768839Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 19 09:28:05.966000 audit[5326]: USER_ACCT pid=5326 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:05.969682 sshd[5326]: Accepted publickey for core from 4.153.228.146 port 39872 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:28:05.970000 audit[5326]: CRED_ACQ pid=5326 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:05.970000 audit[5326]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc38a0e9d0 a2=3 a3=0 items=0 ppid=1 pid=5326 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:05.970000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:28:05.973784 sshd-session[5326]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:28:05.984802 systemd-logind[1647]: New session 18 of user core. Jan 19 09:28:05.987709 systemd[1]: Started session-18.scope - Session 18 of User core. 
Jan 19 09:28:05.991000 audit[5326]: USER_START pid=5326 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:05.994000 audit[5330]: CRED_ACQ pid=5330 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:06.342101 containerd[1676]: time="2026-01-19T09:28:06.342044476Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:28:06.343597 containerd[1676]: time="2026-01-19T09:28:06.343548745Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 19 09:28:06.343693 containerd[1676]: time="2026-01-19T09:28:06.343637444Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 19 09:28:06.343859 kubelet[2926]: E0119 09:28:06.343815 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 19 09:28:06.343916 kubelet[2926]: E0119 09:28:06.343866 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 19 09:28:06.344079 kubelet[2926]: E0119 09:28:06.344051 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-5cb97b98cc-snlsk_calico-system(61f2280f-1a00-4605-a568-ed8af0fdcb06): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 19 09:28:06.344134 kubelet[2926]: E0119 09:28:06.344097 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cb97b98cc-snlsk" podUID="61f2280f-1a00-4605-a568-ed8af0fdcb06" Jan 19 09:28:06.344748 containerd[1676]: time="2026-01-19T09:28:06.344691297Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 19 09:28:06.408833 sshd[5330]: 
Connection closed by 4.153.228.146 port 39872 Jan 19 09:28:06.409577 sshd-session[5326]: pam_unix(sshd:session): session closed for user core Jan 19 09:28:06.409000 audit[5326]: USER_END pid=5326 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:06.409000 audit[5326]: CRED_DISP pid=5326 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:06.413562 systemd-logind[1647]: Session 18 logged out. Waiting for processes to exit. Jan 19 09:28:06.415291 systemd[1]: sshd@17-77.42.19.180:22-4.153.228.146:39872.service: Deactivated successfully. Jan 19 09:28:06.414000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-77.42.19.180:22-4.153.228.146:39872 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:28:06.417997 systemd[1]: session-18.scope: Deactivated successfully. Jan 19 09:28:06.419847 systemd-logind[1647]: Removed session 18. Jan 19 09:28:06.772030 containerd[1676]: time="2026-01-19T09:28:06.771268678Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:28:06.773002 containerd[1676]: time="2026-01-19T09:28:06.772926316Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 19 09:28:06.773112 containerd[1676]: time="2026-01-19T09:28:06.773034535Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 19 09:28:06.773387 kubelet[2926]: E0119 09:28:06.773295 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:28:06.773387 kubelet[2926]: E0119 09:28:06.773349 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:28:06.774088 kubelet[2926]: E0119 09:28:06.773452 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-558fc744bc-cf5rv_calico-apiserver(37c84e8a-a62d-48e6-91f5-791aabccbba8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 19 09:28:06.774088 kubelet[2926]: E0119 09:28:06.773490 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to 
pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:28:07.225557 containerd[1676]: time="2026-01-19T09:28:07.225505914Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 19 09:28:07.682318 containerd[1676]: time="2026-01-19T09:28:07.682255569Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:28:07.684129 containerd[1676]: time="2026-01-19T09:28:07.683555950Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 19 09:28:07.684129 containerd[1676]: time="2026-01-19T09:28:07.683670789Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 19 09:28:07.684259 kubelet[2926]: E0119 09:28:07.683869 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:28:07.684259 kubelet[2926]: E0119 09:28:07.683919 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:28:07.684259 kubelet[2926]: E0119 09:28:07.684229 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-67d5ff6dcb-sg9sb_calico-apiserver(4ce7f23f-45be-4728-a485-c7ae56af8a5c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 19 09:28:07.684917 kubelet[2926]: E0119 09:28:07.684267 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:28:07.685636 containerd[1676]: time="2026-01-19T09:28:07.685603844Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 19 09:28:08.189105 containerd[1676]: time="2026-01-19T09:28:08.189031049Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:28:08.191003 containerd[1676]: time="2026-01-19T09:28:08.190887686Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 19 09:28:08.191167 containerd[1676]: time="2026-01-19T09:28:08.191088565Z" level=info msg="stop pulling 
image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 19 09:28:08.191854 kubelet[2926]: E0119 09:28:08.191741 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 19 09:28:08.191854 kubelet[2926]: E0119 09:28:08.191814 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 19 09:28:08.194242 kubelet[2926]: E0119 09:28:08.191919 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-l9tzs_calico-system(d4703ae1-6e1e-4185-acf2-7e68ee8729a3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 19 09:28:08.194242 kubelet[2926]: E0119 09:28:08.191984 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:28:08.231011 containerd[1676]: time="2026-01-19T09:28:08.229721485Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 19 09:28:08.664919 containerd[1676]: time="2026-01-19T09:28:08.664839219Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:28:08.670480 containerd[1676]: time="2026-01-19T09:28:08.670406390Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 19 09:28:08.670619 containerd[1676]: time="2026-01-19T09:28:08.670530039Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 19 09:28:08.670921 kubelet[2926]: E0119 09:28:08.670869 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 19 09:28:08.671954 kubelet[2926]: E0119 09:28:08.671448 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 19 09:28:08.671954 kubelet[2926]: E0119 09:28:08.671566 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-km4zx_calico-system(cf97e1c1-8e02-43e7-988e-adabcb45ce04): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 19 09:28:08.674295 containerd[1676]: time="2026-01-19T09:28:08.674255713Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 19 09:28:08.909000 audit[5356]: NETFILTER_CFG table=filter:145 family=2 entries=26 op=nft_register_rule pid=5356 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:28:08.909000 audit[5356]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fff9c72c450 a2=0 a3=7fff9c72c43c items=0 ppid=3032 pid=5356 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:08.909000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:28:08.916000 audit[5356]: NETFILTER_CFG table=nat:146 family=2 entries=104 op=nft_register_chain pid=5356 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 19 09:28:08.916000 audit[5356]: SYSCALL arch=c000003e syscall=46 success=yes exit=48684 a0=3 a1=7fff9c72c450 a2=0 a3=7fff9c72c43c items=0 ppid=3032 pid=5356 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:08.916000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 19 09:28:09.105227 containerd[1676]: time="2026-01-19T09:28:09.104537842Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:28:09.108461 containerd[1676]: time="2026-01-19T09:28:09.107493522Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 19 09:28:09.108461 containerd[1676]: time="2026-01-19T09:28:09.107658000Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 19 09:28:09.109196 kubelet[2926]: E0119 09:28:09.109134 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 19 09:28:09.109525 kubelet[2926]: E0119 09:28:09.109480 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 19 09:28:09.109828 kubelet[2926]: E0119 09:28:09.109789 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-km4zx_calico-system(cf97e1c1-8e02-43e7-988e-adabcb45ce04): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 19 09:28:09.110490 kubelet[2926]: E0119 09:28:09.110367 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:28:10.230421 containerd[1676]: time="2026-01-19T09:28:10.229849380Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 19 09:28:10.668658 containerd[1676]: time="2026-01-19T09:28:10.668405070Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:28:10.670434 containerd[1676]: time="2026-01-19T09:28:10.670381686Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 19 09:28:10.671576 containerd[1676]: time="2026-01-19T09:28:10.671481469Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 19 09:28:10.672519 kubelet[2926]: E0119 09:28:10.672470 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 19 09:28:10.672997 kubelet[2926]: E0119 09:28:10.672527 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 19 09:28:10.672997 kubelet[2926]: E0119 09:28:10.672598 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-6bcfd67b5c-6xwgh_calico-system(c1023597-3f97-435e-a787-6c49b8480f62): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 19 09:28:10.672997 kubelet[2926]: E0119 09:28:10.672638 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:28:11.564666 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 19 09:28:11.564824 kernel: audit: type=1130 audit(1768814891.560:856): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-77.42.19.180:22-4.153.228.146:39880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:28:11.560000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-77.42.19.180:22-4.153.228.146:39880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:28:11.561912 systemd[1]: Started sshd@18-77.42.19.180:22-4.153.228.146:39880.service - OpenSSH per-connection server daemon (4.153.228.146:39880). Jan 19 09:28:12.240994 sshd[5358]: Accepted publickey for core from 4.153.228.146 port 39880 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:28:12.239000 audit[5358]: USER_ACCT pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:12.245031 sshd-session[5358]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:28:12.257048 kernel: audit: type=1101 audit(1768814892.239:857): pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:12.257225 kernel: audit: type=1103 audit(1768814892.240:858): pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:12.240000 audit[5358]: CRED_ACQ pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:12.270821 kernel: audit: type=1006 audit(1768814892.240:859): pid=5358 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1 Jan 19 09:28:12.270935 kernel: audit: type=1300 audit(1768814892.240:859): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffb21812a0 a2=3 a3=0 items=0 ppid=1 pid=5358 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:12.240000 audit[5358]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffb21812a0 a2=3 a3=0 items=0 ppid=1 pid=5358 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:12.264568 systemd-logind[1647]: New session 19 of user core. 
Jan 19 09:28:12.274559 kernel: audit: type=1327 audit(1768814892.240:859): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:28:12.240000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:28:12.273135 systemd[1]: Started session-19.scope - Session 19 of User core. Jan 19 09:28:12.281000 audit[5358]: USER_START pid=5358 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:12.294439 kernel: audit: type=1105 audit(1768814892.281:860): pid=5358 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:12.294000 audit[5369]: CRED_ACQ pid=5369 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:12.304417 kernel: audit: type=1103 audit(1768814892.294:861): pid=5369 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:12.712235 sshd[5369]: Connection closed by 4.153.228.146 port 39880 Jan 19 09:28:12.714500 sshd-session[5358]: pam_unix(sshd:session): session closed for user core Jan 19 09:28:12.715000 audit[5358]: USER_END pid=5358 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:12.724591 systemd[1]: sshd@18-77.42.19.180:22-4.153.228.146:39880.service: Deactivated successfully. Jan 19 09:28:12.726536 kernel: audit: type=1106 audit(1768814892.715:862): pid=5358 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:12.716000 audit[5358]: CRED_DISP pid=5358 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:12.733110 systemd[1]: session-19.scope: Deactivated successfully. 
Jan 19 09:28:12.734466 kernel: audit: type=1104 audit(1768814892.716:863): pid=5358 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:12.725000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-77.42.19.180:22-4.153.228.146:39880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:28:12.739805 systemd-logind[1647]: Session 19 logged out. Waiting for processes to exit. Jan 19 09:28:12.742808 systemd-logind[1647]: Removed session 19. Jan 19 09:28:13.226295 kubelet[2926]: E0119 09:28:13.226189 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" Jan 19 09:28:17.854302 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 19 09:28:17.854442 kernel: audit: type=1130 audit(1768814897.845:865): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-77.42.19.180:22-4.153.228.146:50042 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:28:17.845000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-77.42.19.180:22-4.153.228.146:50042 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 19 09:28:17.847125 systemd[1]: Started sshd@19-77.42.19.180:22-4.153.228.146:50042.service - OpenSSH per-connection server daemon (4.153.228.146:50042). 
Jan 19 09:28:18.497000 audit[5381]: USER_ACCT pid=5381 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:18.506421 kernel: audit: type=1101 audit(1768814898.497:866): pid=5381 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:18.507705 sshd[5381]: Accepted publickey for core from 4.153.228.146 port 50042 ssh2: RSA SHA256:eSExdM4v9/N03grYyJxmpMcfBAsqn6ghNdwGRTmb0HE Jan 19 09:28:18.510049 sshd-session[5381]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 19 09:28:18.507000 audit[5381]: CRED_ACQ pid=5381 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:18.518506 kernel: audit: type=1103 audit(1768814898.507:867): pid=5381 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:18.524808 kernel: audit: type=1006 audit(1768814898.507:868): pid=5381 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Jan 19 09:28:18.507000 audit[5381]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffbd519280 a2=3 a3=0 items=0 ppid=1 pid=5381 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:18.534678 kernel: audit: type=1300 audit(1768814898.507:868): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffbd519280 a2=3 a3=0 items=0 ppid=1 pid=5381 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:18.537539 systemd-logind[1647]: New session 20 of user core. Jan 19 09:28:18.507000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:28:18.541418 kernel: audit: type=1327 audit(1768814898.507:868): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 19 09:28:18.543801 systemd[1]: Started session-20.scope - Session 20 of User core. 
Jan 19 09:28:18.546000 audit[5381]: USER_START pid=5381 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:18.556741 kernel: audit: type=1105 audit(1768814898.546:869): pid=5381 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:18.556830 kernel: audit: type=1103 audit(1768814898.554:870): pid=5385 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:18.554000 audit[5385]: CRED_ACQ pid=5385 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:18.942420 sshd[5385]: Connection closed by 4.153.228.146 port 50042 Jan 19 09:28:18.944592 sshd-session[5381]: pam_unix(sshd:session): session closed for user core Jan 19 09:28:18.944000 audit[5381]: USER_END pid=5381 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:18.950361 systemd[1]: sshd@19-77.42.19.180:22-4.153.228.146:50042.service: Deactivated successfully. Jan 19 09:28:18.953299 systemd[1]: session-20.scope: Deactivated successfully. Jan 19 09:28:18.962985 kernel: audit: type=1106 audit(1768814898.944:871): pid=5381 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:18.963078 kernel: audit: type=1104 audit(1768814898.944:872): pid=5381 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:18.944000 audit[5381]: CRED_DISP pid=5381 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 19 09:28:18.957938 systemd-logind[1647]: Session 20 logged out. Waiting for processes to exit. Jan 19 09:28:18.959535 systemd-logind[1647]: Removed session 20. Jan 19 09:28:18.949000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-77.42.19.180:22-4.153.228.146:50042 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 19 09:28:19.226528 kubelet[2926]: E0119 09:28:19.226324 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:28:20.227644 kubelet[2926]: E0119 09:28:20.227587 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cb97b98cc-snlsk" podUID="61f2280f-1a00-4605-a568-ed8af0fdcb06" Jan 19 09:28:22.228954 kubelet[2926]: E0119 09:28:22.228795 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:28:22.228954 kubelet[2926]: E0119 09:28:22.228882 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:28:23.225926 kubelet[2926]: E0119 09:28:23.225647 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" 
pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:28:24.229911 kubelet[2926]: E0119 09:28:24.229338 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:28:28.227527 kubelet[2926]: E0119 09:28:28.226937 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" Jan 19 09:28:30.225383 kubelet[2926]: E0119 09:28:30.225313 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:28:31.226535 kubelet[2926]: E0119 09:28:31.226471 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cb97b98cc-snlsk" podUID="61f2280f-1a00-4605-a568-ed8af0fdcb06" Jan 19 09:28:33.225428 kubelet[2926]: E0119 09:28:33.225353 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:28:35.226576 kubelet[2926]: E0119 09:28:35.226488 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to 
\"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:28:36.005126 systemd[1]: cri-containerd-bcf6a2742f6d0b8b41af109b921083c1d83650af77324cf0d342b6097830124e.scope: Deactivated successfully. Jan 19 09:28:36.005885 systemd[1]: cri-containerd-bcf6a2742f6d0b8b41af109b921083c1d83650af77324cf0d342b6097830124e.scope: Consumed 3.597s CPU time, 66.8M memory peak. Jan 19 09:28:36.005000 audit: BPF prog-id=261 op=LOAD Jan 19 09:28:36.008430 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 19 09:28:36.008557 kernel: audit: type=1334 audit(1768814916.005:874): prog-id=261 op=LOAD Jan 19 09:28:36.011312 containerd[1676]: time="2026-01-19T09:28:36.011040241Z" level=info msg="received container exit event container_id:\"bcf6a2742f6d0b8b41af109b921083c1d83650af77324cf0d342b6097830124e\" id:\"bcf6a2742f6d0b8b41af109b921083c1d83650af77324cf0d342b6097830124e\" pid:2749 exit_status:1 exited_at:{seconds:1768814916 nanos:10670172}" Jan 19 09:28:36.005000 audit: BPF prog-id=83 op=UNLOAD Jan 19 09:28:36.008000 audit: BPF prog-id=98 op=UNLOAD Jan 19 09:28:36.018480 kernel: audit: type=1334 audit(1768814916.005:875): prog-id=83 op=UNLOAD Jan 19 09:28:36.018582 kernel: audit: type=1334 audit(1768814916.008:876): prog-id=98 op=UNLOAD Jan 19 09:28:36.008000 audit: BPF prog-id=102 op=UNLOAD Jan 19 09:28:36.021969 kernel: audit: type=1334 audit(1768814916.008:877): prog-id=102 op=UNLOAD Jan 19 09:28:36.044095 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bcf6a2742f6d0b8b41af109b921083c1d83650af77324cf0d342b6097830124e-rootfs.mount: Deactivated successfully. 
Jan 19 09:28:36.222236 kubelet[2926]: E0119 09:28:36.222144 2926 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:57382->10.0.0.2:2379: read: connection timed out" Jan 19 09:28:36.225917 kubelet[2926]: E0119 09:28:36.225852 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:28:36.760087 kubelet[2926]: I0119 09:28:36.760054 2926 scope.go:117] "RemoveContainer" containerID="bcf6a2742f6d0b8b41af109b921083c1d83650af77324cf0d342b6097830124e" Jan 19 09:28:36.761898 containerd[1676]: time="2026-01-19T09:28:36.761859690Z" level=info msg="CreateContainer within sandbox \"d56cce1ea29d845bb3cd4a0cfe1cc111d146b1e361f848116e3745a8f22fe120\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 19 09:28:36.774738 containerd[1676]: time="2026-01-19T09:28:36.774540397Z" level=info msg="Container 8e6c9634ee080d896e0e5c88099797e01fc55de0d94a1fa9ac90b20b2588a986: CDI devices from CRI Config.CDIDevices: []" Jan 19 09:28:36.779087 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount325759405.mount: Deactivated successfully. Jan 19 09:28:36.783261 containerd[1676]: time="2026-01-19T09:28:36.783186603Z" level=info msg="CreateContainer within sandbox \"d56cce1ea29d845bb3cd4a0cfe1cc111d146b1e361f848116e3745a8f22fe120\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"8e6c9634ee080d896e0e5c88099797e01fc55de0d94a1fa9ac90b20b2588a986\"" Jan 19 09:28:36.784448 containerd[1676]: time="2026-01-19T09:28:36.783673160Z" level=info msg="StartContainer for \"8e6c9634ee080d896e0e5c88099797e01fc55de0d94a1fa9ac90b20b2588a986\"" Jan 19 09:28:36.785463 containerd[1676]: time="2026-01-19T09:28:36.785442872Z" level=info msg="connecting to shim 8e6c9634ee080d896e0e5c88099797e01fc55de0d94a1fa9ac90b20b2588a986" address="unix:///run/containerd/s/cea780955e301682bc3016340dafa5dece290eea84c26c619f2ce28192434b95" protocol=ttrpc version=3 Jan 19 09:28:36.790410 containerd[1676]: time="2026-01-19T09:28:36.790364757Z" level=info msg="received container exit event container_id:\"8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22\" id:\"8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22\" pid:3251 exit_status:1 exited_at:{seconds:1768814916 nanos:789849369}" Jan 19 09:28:36.791342 systemd[1]: cri-containerd-8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22.scope: Deactivated successfully. Jan 19 09:28:36.792695 systemd[1]: cri-containerd-8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22.scope: Consumed 16.573s CPU time, 120.2M memory peak. 
Jan 19 09:28:36.795000 audit: BPF prog-id=146 op=UNLOAD Jan 19 09:28:36.798409 kernel: audit: type=1334 audit(1768814916.795:878): prog-id=146 op=UNLOAD Jan 19 09:28:36.795000 audit: BPF prog-id=150 op=UNLOAD Jan 19 09:28:36.802420 kernel: audit: type=1334 audit(1768814916.795:879): prog-id=150 op=UNLOAD Jan 19 09:28:36.811626 systemd[1]: Started cri-containerd-8e6c9634ee080d896e0e5c88099797e01fc55de0d94a1fa9ac90b20b2588a986.scope - libcontainer container 8e6c9634ee080d896e0e5c88099797e01fc55de0d94a1fa9ac90b20b2588a986. Jan 19 09:28:36.824000 audit: BPF prog-id=262 op=LOAD Jan 19 09:28:36.827594 kernel: audit: type=1334 audit(1768814916.824:880): prog-id=262 op=LOAD Jan 19 09:28:36.827645 kernel: audit: type=1334 audit(1768814916.826:881): prog-id=263 op=LOAD Jan 19 09:28:36.826000 audit: BPF prog-id=263 op=LOAD Jan 19 09:28:36.826000 audit[5436]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2618 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:36.832767 kernel: audit: type=1300 audit(1768814916.826:881): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2618 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:36.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865366339363334656530383064383936653065356338383039393739 Jan 19 09:28:36.839550 kernel: audit: type=1327 audit(1768814916.826:881): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865366339363334656530383064383936653065356338383039393739 Jan 19 09:28:36.826000 audit: BPF prog-id=263 op=UNLOAD Jan 19 09:28:36.826000 audit[5436]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2618 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:36.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865366339363334656530383064383936653065356338383039393739 Jan 19 09:28:36.826000 audit: BPF prog-id=264 op=LOAD Jan 19 09:28:36.826000 audit[5436]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2618 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:36.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865366339363334656530383064383936653065356338383039393739 Jan 19 09:28:36.826000 audit: BPF prog-id=265 
op=LOAD Jan 19 09:28:36.826000 audit[5436]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2618 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:36.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865366339363334656530383064383936653065356338383039393739 Jan 19 09:28:36.826000 audit: BPF prog-id=265 op=UNLOAD Jan 19 09:28:36.826000 audit[5436]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2618 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:36.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865366339363334656530383064383936653065356338383039393739 Jan 19 09:28:36.826000 audit: BPF prog-id=264 op=UNLOAD Jan 19 09:28:36.826000 audit[5436]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2618 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:36.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865366339363334656530383064383936653065356338383039393739 Jan 19 09:28:36.826000 audit: BPF prog-id=266 op=LOAD Jan 19 09:28:36.826000 audit[5436]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2618 pid=5436 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:36.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865366339363334656530383064383936653065356338383039393739 Jan 19 09:28:36.872064 containerd[1676]: time="2026-01-19T09:28:36.871978467Z" level=info msg="StartContainer for \"8e6c9634ee080d896e0e5c88099797e01fc55de0d94a1fa9ac90b20b2588a986\" returns successfully" Jan 19 09:28:37.048687 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22-rootfs.mount: Deactivated successfully. 
Jan 19 09:28:37.224782 kubelet[2926]: E0119 09:28:37.224744 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:28:37.764472 kubelet[2926]: I0119 09:28:37.764423 2926 scope.go:117] "RemoveContainer" containerID="8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22" Jan 19 09:28:37.771279 containerd[1676]: time="2026-01-19T09:28:37.771233684Z" level=info msg="CreateContainer within sandbox \"e9866da1a39955c0d1deca5238720e49e3764c91c409ebc27d9b2bd7d3c644bf\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 19 09:28:37.784200 containerd[1676]: time="2026-01-19T09:28:37.783132774Z" level=info msg="Container 64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906: CDI devices from CRI Config.CDIDevices: []" Jan 19 09:28:37.791323 containerd[1676]: time="2026-01-19T09:28:37.791283594Z" level=info msg="CreateContainer within sandbox \"e9866da1a39955c0d1deca5238720e49e3764c91c409ebc27d9b2bd7d3c644bf\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906\"" Jan 19 09:28:37.791879 containerd[1676]: time="2026-01-19T09:28:37.791841471Z" level=info msg="StartContainer for \"64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906\"" Jan 19 09:28:37.792601 containerd[1676]: time="2026-01-19T09:28:37.792572378Z" level=info msg="connecting to shim 64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906" address="unix:///run/containerd/s/c663bd7e7f7ace4214b4e6ba1b09322776babf8109134ce90184dfd1ca37d5d0" protocol=ttrpc version=3 Jan 19 09:28:37.814570 systemd[1]: Started cri-containerd-64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906.scope - libcontainer container 64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906. 
Jan 19 09:28:37.826000 audit: BPF prog-id=267 op=LOAD Jan 19 09:28:37.827000 audit: BPF prog-id=268 op=LOAD Jan 19 09:28:37.827000 audit[5482]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3063 pid=5482 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:37.827000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393636353436643636646663633737333763646430343761393933 Jan 19 09:28:37.827000 audit: BPF prog-id=268 op=UNLOAD Jan 19 09:28:37.827000 audit[5482]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3063 pid=5482 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:37.827000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393636353436643636646663633737333763646430343761393933 Jan 19 09:28:37.827000 audit: BPF prog-id=269 op=LOAD Jan 19 09:28:37.827000 audit[5482]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3063 pid=5482 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:37.827000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393636353436643636646663633737333763646430343761393933 Jan 19 09:28:37.827000 audit: BPF prog-id=270 op=LOAD Jan 19 09:28:37.827000 audit[5482]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3063 pid=5482 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:37.827000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393636353436643636646663633737333763646430343761393933 Jan 19 09:28:37.827000 audit: BPF prog-id=270 op=UNLOAD Jan 19 09:28:37.827000 audit[5482]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3063 pid=5482 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:37.827000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393636353436643636646663633737333763646430343761393933 Jan 19 09:28:37.827000 audit: BPF prog-id=269 op=UNLOAD Jan 19 09:28:37.827000 audit[5482]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3063 pid=5482 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:37.827000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393636353436643636646663633737333763646430343761393933 Jan 19 09:28:37.827000 audit: BPF prog-id=271 op=LOAD Jan 19 09:28:37.827000 audit[5482]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3063 pid=5482 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:37.827000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393636353436643636646663633737333763646430343761393933 Jan 19 09:28:37.845476 containerd[1676]: time="2026-01-19T09:28:37.845384165Z" level=info msg="StartContainer for \"64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906\" returns successfully" Jan 19 09:28:40.901879 systemd[1]: cri-containerd-12978616055efbac44cee81d13b062b80e36c9ac2a5df26aa3e2053667014e1e.scope: Deactivated successfully. Jan 19 09:28:40.903579 systemd[1]: cri-containerd-12978616055efbac44cee81d13b062b80e36c9ac2a5df26aa3e2053667014e1e.scope: Consumed 2.212s CPU time, 22.3M memory peak. Jan 19 09:28:40.903000 audit: BPF prog-id=103 op=UNLOAD Jan 19 09:28:40.903000 audit: BPF prog-id=107 op=UNLOAD Jan 19 09:28:40.904000 audit: BPF prog-id=272 op=LOAD Jan 19 09:28:40.904000 audit: BPF prog-id=88 op=UNLOAD Jan 19 09:28:40.910108 containerd[1676]: time="2026-01-19T09:28:40.910015357Z" level=info msg="received container exit event container_id:\"12978616055efbac44cee81d13b062b80e36c9ac2a5df26aa3e2053667014e1e\" id:\"12978616055efbac44cee81d13b062b80e36c9ac2a5df26aa3e2053667014e1e\" pid:2761 exit_status:1 exited_at:{seconds:1768814920 nanos:908308795}" Jan 19 09:28:40.959052 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-12978616055efbac44cee81d13b062b80e36c9ac2a5df26aa3e2053667014e1e-rootfs.mount: Deactivated successfully. 
Jan 19 09:28:41.225292 kubelet[2926]: E0119 09:28:41.225128 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" Jan 19 09:28:41.780618 kubelet[2926]: I0119 09:28:41.780575 2926 scope.go:117] "RemoveContainer" containerID="12978616055efbac44cee81d13b062b80e36c9ac2a5df26aa3e2053667014e1e" Jan 19 09:28:41.783419 containerd[1676]: time="2026-01-19T09:28:41.782899266Z" level=info msg="CreateContainer within sandbox \"31992d85513ba8ca835eb6dabb2b0e4884baaec119beb6883d661ef94758149d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 19 09:28:41.797192 containerd[1676]: time="2026-01-19T09:28:41.797148537Z" level=info msg="Container cf802ea41479958cc358c673b7b10a60320e78046de7e555d972cd821fe98692: CDI devices from CRI Config.CDIDevices: []" Jan 19 09:28:41.805309 containerd[1676]: time="2026-01-19T09:28:41.805262909Z" level=info msg="CreateContainer within sandbox \"31992d85513ba8ca835eb6dabb2b0e4884baaec119beb6883d661ef94758149d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"cf802ea41479958cc358c673b7b10a60320e78046de7e555d972cd821fe98692\"" Jan 19 09:28:41.805851 containerd[1676]: time="2026-01-19T09:28:41.805822485Z" level=info msg="StartContainer for \"cf802ea41479958cc358c673b7b10a60320e78046de7e555d972cd821fe98692\"" Jan 19 09:28:41.806880 containerd[1676]: time="2026-01-19T09:28:41.806854891Z" level=info msg="connecting to shim cf802ea41479958cc358c673b7b10a60320e78046de7e555d972cd821fe98692" address="unix:///run/containerd/s/e919c13deb0ac45eeadce23447329831c7bd60a1fa7bdf5921cd2cbbe9829f31" protocol=ttrpc version=3 Jan 19 09:28:41.829567 systemd[1]: Started cri-containerd-cf802ea41479958cc358c673b7b10a60320e78046de7e555d972cd821fe98692.scope - libcontainer container cf802ea41479958cc358c673b7b10a60320e78046de7e555d972cd821fe98692. 
Jan 19 09:28:41.840000 audit: BPF prog-id=273 op=LOAD Jan 19 09:28:41.842732 kernel: kauditd_printk_skb: 44 callbacks suppressed Jan 19 09:28:41.842785 kernel: audit: type=1334 audit(1768814921.840:900): prog-id=273 op=LOAD Jan 19 09:28:41.844000 audit: BPF prog-id=274 op=LOAD Jan 19 09:28:41.844000 audit[5526]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=2599 pid=5526 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:41.855751 kernel: audit: type=1334 audit(1768814921.844:901): prog-id=274 op=LOAD Jan 19 09:28:41.855814 kernel: audit: type=1300 audit(1768814921.844:901): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=2599 pid=5526 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:41.855839 kernel: audit: type=1327 audit(1768814921.844:901): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366383032656134313437393935386363333538633637336237623130 Jan 19 09:28:41.844000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366383032656134313437393935386363333538633637336237623130 Jan 19 09:28:41.865235 kernel: audit: type=1334 audit(1768814921.844:902): prog-id=274 op=UNLOAD Jan 19 09:28:41.844000 audit: BPF prog-id=274 op=UNLOAD Jan 19 09:28:41.871937 kernel: audit: type=1300 audit(1768814921.844:902): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=5526 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:41.844000 audit[5526]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=5526 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:41.874490 kernel: audit: type=1327 audit(1768814921.844:902): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366383032656134313437393935386363333538633637336237623130 Jan 19 09:28:41.844000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366383032656134313437393935386363333538633637336237623130 Jan 19 09:28:41.880559 kernel: audit: type=1334 audit(1768814921.844:903): prog-id=275 op=LOAD Jan 19 09:28:41.844000 audit: BPF prog-id=275 op=LOAD Jan 19 09:28:41.844000 audit[5526]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=2599 pid=5526 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:41.882902 kernel: audit: type=1300 audit(1768814921.844:903): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=2599 pid=5526 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:41.893633 kernel: audit: type=1327 audit(1768814921.844:903): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366383032656134313437393935386363333538633637336237623130 Jan 19 09:28:41.844000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366383032656134313437393935386363333538633637336237623130 Jan 19 09:28:41.844000 audit: BPF prog-id=276 op=LOAD Jan 19 09:28:41.844000 audit[5526]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=2599 pid=5526 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:41.844000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366383032656134313437393935386363333538633637336237623130 Jan 19 09:28:41.844000 audit: BPF prog-id=276 op=UNLOAD Jan 19 09:28:41.844000 audit[5526]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=5526 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:41.844000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366383032656134313437393935386363333538633637336237623130 Jan 19 09:28:41.844000 audit: BPF prog-id=275 op=UNLOAD Jan 19 09:28:41.844000 audit[5526]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=5526 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:41.844000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366383032656134313437393935386363333538633637336237623130 Jan 19 09:28:41.844000 audit: BPF prog-id=277 op=LOAD Jan 19 09:28:41.844000 audit[5526]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=2599 pid=5526 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:28:41.844000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366383032656134313437393935386363333538633637336237623130 Jan 19 09:28:41.901633 containerd[1676]: time="2026-01-19T09:28:41.901543974Z" level=info msg="StartContainer for \"cf802ea41479958cc358c673b7b10a60320e78046de7e555d972cd821fe98692\" returns successfully" Jan 19 09:28:44.227284 kubelet[2926]: E0119 09:28:44.227187 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:28:44.230648 kubelet[2926]: E0119 09:28:44.230093 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cb97b98cc-snlsk" podUID="61f2280f-1a00-4605-a568-ed8af0fdcb06" Jan 19 09:28:46.224812 kubelet[2926]: E0119 09:28:46.223564 2926 controller.go:195] "Failed to update lease" err="Put \"https://77.42.19.180:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-637e605ed7?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 19 09:28:47.225053 kubelet[2926]: E0119 09:28:47.224967 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:28:48.226107 kubelet[2926]: E0119 09:28:48.225807 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:28:48.227108 kubelet[2926]: E0119 09:28:48.226799 2926 
pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:28:49.002768 systemd[1]: cri-containerd-64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906.scope: Deactivated successfully. Jan 19 09:28:49.005678 containerd[1676]: time="2026-01-19T09:28:49.005610752Z" level=info msg="received container exit event container_id:\"64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906\" id:\"64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906\" pid:5495 exit_status:1 exited_at:{seconds:1768814929 nanos:5139635}" Jan 19 09:28:49.007251 kernel: kauditd_printk_skb: 12 callbacks suppressed Jan 19 09:28:49.007337 kernel: audit: type=1334 audit(1768814929.005:908): prog-id=267 op=UNLOAD Jan 19 09:28:49.005000 audit: BPF prog-id=267 op=UNLOAD Jan 19 09:28:49.005000 audit: BPF prog-id=271 op=UNLOAD Jan 19 09:28:49.010633 kernel: audit: type=1334 audit(1768814929.005:909): prog-id=271 op=UNLOAD Jan 19 09:28:49.050044 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906-rootfs.mount: Deactivated successfully. 
Jan 19 09:28:49.225537 kubelet[2926]: E0119 09:28:49.225446 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:28:49.805547 kubelet[2926]: I0119 09:28:49.805476 2926 scope.go:117] "RemoveContainer" containerID="8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22" Jan 19 09:28:49.806230 kubelet[2926]: I0119 09:28:49.805745 2926 scope.go:117] "RemoveContainer" containerID="64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906" Jan 19 09:28:49.806230 kubelet[2926]: E0119 09:28:49.806185 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-65cdcdfd6d-b5zsr_tigera-operator(7d1cbfd7-8a3f-4d2d-a0ff-0d5c26a16927)\"" pod="tigera-operator/tigera-operator-65cdcdfd6d-b5zsr" podUID="7d1cbfd7-8a3f-4d2d-a0ff-0d5c26a16927" Jan 19 09:28:49.807110 containerd[1676]: time="2026-01-19T09:28:49.807073868Z" level=info msg="RemoveContainer for \"8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22\"" Jan 19 09:28:49.812905 containerd[1676]: time="2026-01-19T09:28:49.812795921Z" level=info msg="RemoveContainer for \"8a3af622a5566d9444f2d8c9008d8699d252ba92f881173d544f685d43f9dd22\" returns successfully" Jan 19 09:28:52.225364 kubelet[2926]: E0119 09:28:52.225260 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" Jan 19 09:28:56.224952 kubelet[2926]: E0119 09:28:56.224699 2926 controller.go:195] "Failed to update lease" err="Put \"https://77.42.19.180:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-637e605ed7?timeout=10s\": context deadline exceeded" Jan 19 09:28:58.225766 kubelet[2926]: E0119 09:28:58.225647 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:28:58.226420 kubelet[2926]: E0119 09:28:58.226142 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cb97b98cc-snlsk" podUID="61f2280f-1a00-4605-a568-ed8af0fdcb06" Jan 19 09:28:59.225855 kubelet[2926]: E0119 09:28:59.225776 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:29:00.227215 kubelet[2926]: E0119 09:29:00.227044 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:29:02.226178 kubelet[2926]: E0119 09:29:02.226082 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:29:02.233931 kubelet[2926]: E0119 09:29:02.231155 2926 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{calico-apiserver-67d5ff6dcb-bnwh7.188c17b040fec110 calico-apiserver 1605 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-apiserver,Name:calico-apiserver-67d5ff6dcb-bnwh7,UID:911306cd-bd01-4287-9129-4bbfd31bbd16,APIVersion:v1,ResourceVersion:843,FieldPath:spec.containers{calico-apiserver},},Reason:BackOff,Message:Back-off pulling image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4580-0-0-p-637e605ed7,},FirstTimestamp:2026-01-19 09:26:39 +0000 UTC,LastTimestamp:2026-01-19 09:28:28.226874311 +0000 UTC m=+158.123693102,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4580-0-0-p-637e605ed7,}" Jan 19 09:29:03.225477 kubelet[2926]: E0119 09:29:03.225430 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:29:04.228096 kubelet[2926]: I0119 09:29:04.228036 2926 scope.go:117] "RemoveContainer" containerID="64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906" Jan 19 09:29:04.229288 kubelet[2926]: E0119 09:29:04.229236 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" Jan 19 09:29:04.231221 containerd[1676]: time="2026-01-19T09:29:04.231162690Z" level=info msg="CreateContainer within sandbox \"e9866da1a39955c0d1deca5238720e49e3764c91c409ebc27d9b2bd7d3c644bf\" for container &ContainerMetadata{Name:tigera-operator,Attempt:2,}" Jan 19 09:29:04.243423 containerd[1676]: time="2026-01-19T09:29:04.242856841Z" level=info msg="Container 0debb138f6f1a4985247d7e09f174c5313f84cb46e289bbc01a57f6c104d172c: CDI devices from CRI Config.CDIDevices: []" Jan 19 09:29:04.255479 containerd[1676]: time="2026-01-19T09:29:04.255357962Z" level=info msg="CreateContainer within sandbox \"e9866da1a39955c0d1deca5238720e49e3764c91c409ebc27d9b2bd7d3c644bf\" for &ContainerMetadata{Name:tigera-operator,Attempt:2,} returns container id \"0debb138f6f1a4985247d7e09f174c5313f84cb46e289bbc01a57f6c104d172c\"" Jan 19 09:29:04.255974 containerd[1676]: time="2026-01-19T09:29:04.255920834Z" level=info msg="StartContainer for \"0debb138f6f1a4985247d7e09f174c5313f84cb46e289bbc01a57f6c104d172c\"" Jan 19 09:29:04.256932 containerd[1676]: time="2026-01-19T09:29:04.256904325Z" level=info msg="connecting to shim 0debb138f6f1a4985247d7e09f174c5313f84cb46e289bbc01a57f6c104d172c" address="unix:///run/containerd/s/c663bd7e7f7ace4214b4e6ba1b09322776babf8109134ce90184dfd1ca37d5d0" protocol=ttrpc version=3 Jan 19 09:29:04.295835 systemd[1]: Started cri-containerd-0debb138f6f1a4985247d7e09f174c5313f84cb46e289bbc01a57f6c104d172c.scope - libcontainer container 0debb138f6f1a4985247d7e09f174c5313f84cb46e289bbc01a57f6c104d172c. 
Jan 19 09:29:04.323000 audit: BPF prog-id=278 op=LOAD Jan 19 09:29:04.329457 kernel: audit: type=1334 audit(1768814944.323:910): prog-id=278 op=LOAD Jan 19 09:29:04.328000 audit: BPF prog-id=279 op=LOAD Jan 19 09:29:04.328000 audit[5601]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=3063 pid=5601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:29:04.337886 kernel: audit: type=1334 audit(1768814944.328:911): prog-id=279 op=LOAD Jan 19 09:29:04.338036 kernel: audit: type=1300 audit(1768814944.328:911): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=3063 pid=5601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:29:04.328000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064656262313338663666316134393835323437643765303966313734 Jan 19 09:29:04.353246 kernel: audit: type=1327 audit(1768814944.328:911): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064656262313338663666316134393835323437643765303966313734 Jan 19 09:29:04.328000 audit: BPF prog-id=279 op=UNLOAD Jan 19 09:29:04.364542 kernel: audit: type=1334 audit(1768814944.328:912): prog-id=279 op=UNLOAD Jan 19 09:29:04.328000 audit[5601]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3063 pid=5601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:29:04.381462 kernel: audit: type=1300 audit(1768814944.328:912): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3063 pid=5601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:29:04.328000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064656262313338663666316134393835323437643765303966313734 Jan 19 09:29:04.399607 kernel: audit: type=1327 audit(1768814944.328:912): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064656262313338663666316134393835323437643765303966313734 Jan 19 09:29:04.399770 kernel: audit: type=1334 audit(1768814944.328:913): prog-id=280 op=LOAD Jan 19 09:29:04.328000 audit: BPF prog-id=280 op=LOAD Jan 19 09:29:04.328000 audit[5601]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3063 pid=5601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
19 09:29:04.416465 kernel: audit: type=1300 audit(1768814944.328:913): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3063 pid=5601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:29:04.430534 kernel: audit: type=1327 audit(1768814944.328:913): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064656262313338663666316134393835323437643765303966313734 Jan 19 09:29:04.328000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064656262313338663666316134393835323437643765303966313734 Jan 19 09:29:04.329000 audit: BPF prog-id=281 op=LOAD Jan 19 09:29:04.329000 audit[5601]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3063 pid=5601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:29:04.329000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064656262313338663666316134393835323437643765303966313734 Jan 19 09:29:04.329000 audit: BPF prog-id=281 op=UNLOAD Jan 19 09:29:04.329000 audit[5601]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3063 pid=5601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:29:04.329000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064656262313338663666316134393835323437643765303966313734 Jan 19 09:29:04.329000 audit: BPF prog-id=280 op=UNLOAD Jan 19 09:29:04.329000 audit[5601]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3063 pid=5601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:29:04.329000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064656262313338663666316134393835323437643765303966313734 Jan 19 09:29:04.329000 audit: BPF prog-id=282 op=LOAD Jan 19 09:29:04.329000 audit[5601]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3063 pid=5601 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 19 09:29:04.329000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064656262313338663666316134393835323437643765303966313734 Jan 19 09:29:04.439778 containerd[1676]: time="2026-01-19T09:29:04.439721713Z" level=info msg="StartContainer for \"0debb138f6f1a4985247d7e09f174c5313f84cb46e289bbc01a57f6c104d172c\" returns successfully" Jan 19 09:29:06.225785 kubelet[2926]: E0119 09:29:06.225678 2926 request.go:1196] "Unexpected error when reading response body" err="net/http: request canceled (Client.Timeout or context cancellation while reading body)" Jan 19 09:29:06.225785 kubelet[2926]: E0119 09:29:06.225775 2926 controller.go:195] "Failed to update lease" err="unexpected error when reading response body. Please retry. Original error: net/http: request canceled (Client.Timeout or context cancellation while reading body)" Jan 19 09:29:12.225350 kubelet[2926]: E0119 09:29:12.225252 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:29:13.225522 kubelet[2926]: E0119 09:29:13.225467 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cb97b98cc-snlsk" podUID="61f2280f-1a00-4605-a568-ed8af0fdcb06" Jan 19 09:29:14.227985 kubelet[2926]: E0119 09:29:14.227720 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3" Jan 19 09:29:14.227985 kubelet[2926]: E0119 09:29:14.227890 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:29:14.230616 kubelet[2926]: E0119 09:29:14.230518 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:29:15.224804 kubelet[2926]: E0119 09:29:15.224767 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:29:15.615033 systemd[1]: cri-containerd-0debb138f6f1a4985247d7e09f174c5313f84cb46e289bbc01a57f6c104d172c.scope: Deactivated successfully. Jan 19 09:29:15.619442 containerd[1676]: time="2026-01-19T09:29:15.619344780Z" level=info msg="received container exit event container_id:\"0debb138f6f1a4985247d7e09f174c5313f84cb46e289bbc01a57f6c104d172c\" id:\"0debb138f6f1a4985247d7e09f174c5313f84cb46e289bbc01a57f6c104d172c\" pid:5613 exit_status:1 exited_at:{seconds:1768814955 nanos:618613239}" Jan 19 09:29:15.621000 audit: BPF prog-id=278 op=UNLOAD Jan 19 09:29:15.622841 kernel: kauditd_printk_skb: 12 callbacks suppressed Jan 19 09:29:15.622981 kernel: audit: type=1334 audit(1768814955.621:918): prog-id=278 op=UNLOAD Jan 19 09:29:15.621000 audit: BPF prog-id=282 op=UNLOAD Jan 19 09:29:15.630145 kernel: audit: type=1334 audit(1768814955.621:919): prog-id=282 op=UNLOAD Jan 19 09:29:15.670470 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0debb138f6f1a4985247d7e09f174c5313f84cb46e289bbc01a57f6c104d172c-rootfs.mount: Deactivated successfully. 
Jan 19 09:29:15.876598 kubelet[2926]: I0119 09:29:15.876439 2926 scope.go:117] "RemoveContainer" containerID="64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906" Jan 19 09:29:15.877308 kubelet[2926]: I0119 09:29:15.876852 2926 scope.go:117] "RemoveContainer" containerID="0debb138f6f1a4985247d7e09f174c5313f84cb46e289bbc01a57f6c104d172c" Jan 19 09:29:15.877308 kubelet[2926]: E0119 09:29:15.877013 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=tigera-operator pod=tigera-operator-65cdcdfd6d-b5zsr_tigera-operator(7d1cbfd7-8a3f-4d2d-a0ff-0d5c26a16927)\"" pod="tigera-operator/tigera-operator-65cdcdfd6d-b5zsr" podUID="7d1cbfd7-8a3f-4d2d-a0ff-0d5c26a16927" Jan 19 09:29:15.879744 containerd[1676]: time="2026-01-19T09:29:15.879707844Z" level=info msg="RemoveContainer for \"64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906\"" Jan 19 09:29:15.893040 containerd[1676]: time="2026-01-19T09:29:15.892965787Z" level=info msg="RemoveContainer for \"64966546d66dfcc7737cdd047a9933d5e9580733601399e0daf4c8905e782906\" returns successfully" Jan 19 09:29:16.226314 kubelet[2926]: E0119 09:29:16.225944 2926 controller.go:195] "Failed to update lease" err="Put \"https://77.42.19.180:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-637e605ed7?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 19 09:29:16.226314 kubelet[2926]: I0119 09:29:16.225978 2926 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 19 09:29:19.225270 kubelet[2926]: E0119 09:29:19.225187 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" Jan 19 09:29:25.224998 kubelet[2926]: E0119 09:29:25.224934 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-sg9sb" podUID="4ce7f23f-45be-4728-a485-c7ae56af8a5c" Jan 19 09:29:26.229012 kubelet[2926]: E0119 09:29:26.228673 2926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.19.180:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-637e605ed7?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Jan 19 09:29:26.231071 containerd[1676]: time="2026-01-19T09:29:26.230668179Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 19 09:29:26.233087 kubelet[2926]: E0119 09:29:26.233007 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-km4zx" podUID="cf97e1c1-8e02-43e7-988e-adabcb45ce04" Jan 19 09:29:26.674765 containerd[1676]: time="2026-01-19T09:29:26.674707368Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:29:26.676423 containerd[1676]: time="2026-01-19T09:29:26.676369009Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 19 09:29:26.676492 containerd[1676]: time="2026-01-19T09:29:26.676472968Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 19 09:29:26.676729 kubelet[2926]: E0119 09:29:26.676667 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 19 09:29:26.676801 kubelet[2926]: E0119 09:29:26.676733 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 19 09:29:26.676891 kubelet[2926]: E0119 09:29:26.676862 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-5cb97b98cc-snlsk_calico-system(61f2280f-1a00-4605-a568-ed8af0fdcb06): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 19 09:29:26.678298 containerd[1676]: time="2026-01-19T09:29:26.678227630Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 19 09:29:27.112793 containerd[1676]: time="2026-01-19T09:29:27.112690140Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:29:27.114280 containerd[1676]: time="2026-01-19T09:29:27.114225851Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 19 09:29:27.114377 containerd[1676]: time="2026-01-19T09:29:27.114327430Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 19 09:29:27.114659 
kubelet[2926]: E0119 09:29:27.114603 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 19 09:29:27.114704 kubelet[2926]: E0119 09:29:27.114671 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 19 09:29:27.114780 kubelet[2926]: E0119 09:29:27.114757 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-5cb97b98cc-snlsk_calico-system(61f2280f-1a00-4605-a568-ed8af0fdcb06): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 19 09:29:27.114827 kubelet[2926]: E0119 09:29:27.114802 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cb97b98cc-snlsk" podUID="61f2280f-1a00-4605-a568-ed8af0fdcb06" Jan 19 09:29:28.224742 kubelet[2926]: I0119 09:29:28.224454 2926 scope.go:117] "RemoveContainer" containerID="0debb138f6f1a4985247d7e09f174c5313f84cb46e289bbc01a57f6c104d172c" Jan 19 09:29:28.224742 kubelet[2926]: E0119 09:29:28.224697 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=tigera-operator pod=tigera-operator-65cdcdfd6d-b5zsr_tigera-operator(7d1cbfd7-8a3f-4d2d-a0ff-0d5c26a16927)\"" pod="tigera-operator/tigera-operator-65cdcdfd6d-b5zsr" podUID="7d1cbfd7-8a3f-4d2d-a0ff-0d5c26a16927" Jan 19 09:29:28.226494 kubelet[2926]: E0119 09:29:28.226455 2926 status_manager.go:1018] "Failed to get status for pod" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods calico-apiserver-67d5ff6dcb-bnwh7)" podUID="911306cd-bd01-4287-9129-4bbfd31bbd16" pod="calico-apiserver/calico-apiserver-67d5ff6dcb-bnwh7" Jan 19 09:29:29.227039 kubelet[2926]: E0119 09:29:29.226923 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6bcfd67b5c-6xwgh" 
podUID="c1023597-3f97-435e-a787-6c49b8480f62" Jan 19 09:29:29.228280 containerd[1676]: time="2026-01-19T09:29:29.227216056Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 19 09:29:29.655611 containerd[1676]: time="2026-01-19T09:29:29.655548873Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:29:29.657102 containerd[1676]: time="2026-01-19T09:29:29.657059103Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 19 09:29:29.657169 containerd[1676]: time="2026-01-19T09:29:29.657127704Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 19 09:29:29.657314 kubelet[2926]: E0119 09:29:29.657271 2926 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:29:29.657371 kubelet[2926]: E0119 09:29:29.657314 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 19 09:29:29.657690 kubelet[2926]: E0119 09:29:29.657544 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-558fc744bc-cf5rv_calico-apiserver(37c84e8a-a62d-48e6-91f5-791aabccbba8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 19 09:29:29.657690 kubelet[2926]: E0119 09:29:29.657650 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558fc744bc-cf5rv" podUID="37c84e8a-a62d-48e6-91f5-791aabccbba8" Jan 19 09:29:29.657774 containerd[1676]: time="2026-01-19T09:29:29.657650414Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 19 09:29:30.107578 containerd[1676]: time="2026-01-19T09:29:30.107534670Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 19 09:29:30.109021 containerd[1676]: time="2026-01-19T09:29:30.108864060Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 19 09:29:30.109021 containerd[1676]: time="2026-01-19T09:29:30.108938160Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 19 09:29:30.109175 kubelet[2926]: E0119 09:29:30.109087 2926 log.go:32] "PullImage from image service failed" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 19 09:29:30.109175 kubelet[2926]: E0119 09:29:30.109120 2926 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 19 09:29:30.109268 kubelet[2926]: E0119 09:29:30.109205 2926 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-l9tzs_calico-system(d4703ae1-6e1e-4185-acf2-7e68ee8729a3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 19 09:29:30.109307 kubelet[2926]: E0119 09:29:30.109266 2926 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-l9tzs" podUID="d4703ae1-6e1e-4185-acf2-7e68ee8729a3"