Jan 23 18:30:27.881234 kernel: Linux version 6.12.66-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Fri Jan 23 15:50:57 -00 2026 Jan 23 18:30:27.881266 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=ee2a61adbfdca0d8850a6d1564f6a5daa8e67e4645be01ed76a79270fe7c1051 Jan 23 18:30:27.881280 kernel: BIOS-provided physical RAM map: Jan 23 18:30:27.881291 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 23 18:30:27.881305 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ed3efff] usable Jan 23 18:30:27.881314 kernel: BIOS-e820: [mem 0x000000007ed3f000-0x000000007edfffff] reserved Jan 23 18:30:27.881326 kernel: BIOS-e820: [mem 0x000000007ee00000-0x000000007f8ecfff] usable Jan 23 18:30:27.881336 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 23 18:30:27.881345 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 23 18:30:27.881371 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 23 18:30:27.881381 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007ff7bfff] usable Jan 23 18:30:27.881390 kernel: BIOS-e820: [mem 0x000000007ff7c000-0x000000007fffffff] reserved Jan 23 18:30:27.881404 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 23 18:30:27.881414 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 23 18:30:27.881426 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 23 18:30:27.881436 kernel: BIOS-e820: [mem 0x0000000100000000-0x0000000179ffffff] usable Jan 23 18:30:27.881446 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Jan 23 18:30:27.881461 kernel: NX (Execute Disable) protection: active Jan 23 18:30:27.881471 kernel: APIC: Static calls initialized Jan 23 18:30:27.881481 kernel: e820: update [mem 0x7dfab018-0x7dfb4a57] usable ==> usable Jan 23 18:30:27.881492 kernel: e820: update [mem 0x7df6f018-0x7dfaa657] usable ==> usable Jan 23 18:30:27.882498 kernel: e820: update [mem 0x7dc01018-0x7dc3c657] usable ==> usable Jan 23 18:30:27.882511 kernel: extended physical RAM map: Jan 23 18:30:27.882522 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 23 18:30:27.882532 kernel: reserve setup_data: [mem 0x0000000000100000-0x000000007dc01017] usable Jan 23 18:30:27.882542 kernel: reserve setup_data: [mem 0x000000007dc01018-0x000000007dc3c657] usable Jan 23 18:30:27.882553 kernel: reserve setup_data: [mem 0x000000007dc3c658-0x000000007df6f017] usable Jan 23 18:30:27.882568 kernel: reserve setup_data: [mem 0x000000007df6f018-0x000000007dfaa657] usable Jan 23 18:30:27.882579 kernel: reserve setup_data: [mem 0x000000007dfaa658-0x000000007dfab017] usable Jan 23 18:30:27.882643 kernel: reserve setup_data: [mem 0x000000007dfab018-0x000000007dfb4a57] usable Jan 23 18:30:27.882655 kernel: reserve setup_data: [mem 0x000000007dfb4a58-0x000000007ed3efff] usable Jan 23 18:30:27.882666 kernel: reserve setup_data: [mem 0x000000007ed3f000-0x000000007edfffff] reserved Jan 23 18:30:27.882677 kernel: reserve setup_data: [mem 0x000000007ee00000-0x000000007f8ecfff] usable Jan 23 18:30:27.882687 kernel: 
reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 23 18:30:27.882698 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 23 18:30:27.882708 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 23 18:30:27.882718 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007ff7bfff] usable Jan 23 18:30:27.882729 kernel: reserve setup_data: [mem 0x000000007ff7c000-0x000000007fffffff] reserved Jan 23 18:30:27.882744 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 23 18:30:27.882755 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 23 18:30:27.882771 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 23 18:30:27.882782 kernel: reserve setup_data: [mem 0x0000000100000000-0x0000000179ffffff] usable Jan 23 18:30:27.882797 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Jan 23 18:30:27.882808 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II Jan 23 18:30:27.882820 kernel: efi: SMBIOS=0x7f988000 SMBIOS 3.0=0x7f986000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7e01b198 RNG=0x7fb73018 Jan 23 18:30:27.882831 kernel: random: crng init done Jan 23 18:30:27.882842 kernel: efi: Remove mem137: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map Jan 23 18:30:27.882853 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved Jan 23 18:30:27.882864 kernel: secureboot: Secure boot disabled Jan 23 18:30:27.882874 kernel: SMBIOS 3.0.0 present. Jan 23 18:30:27.882885 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017 Jan 23 18:30:27.882900 kernel: DMI: Memory slots populated: 1/1 Jan 23 18:30:27.882911 kernel: Hypervisor detected: KVM Jan 23 18:30:27.882921 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000 Jan 23 18:30:27.882932 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Jan 23 18:30:27.882943 kernel: kvm-clock: using sched offset of 13334090015 cycles Jan 23 18:30:27.882954 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Jan 23 18:30:27.882966 kernel: tsc: Detected 2399.998 MHz processor Jan 23 18:30:27.882978 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Jan 23 18:30:27.882990 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Jan 23 18:30:27.883001 kernel: last_pfn = 0x17a000 max_arch_pfn = 0x10000000000 Jan 23 18:30:27.883017 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs Jan 23 18:30:27.883029 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Jan 23 18:30:27.883040 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000 Jan 23 18:30:27.883051 kernel: Using GB pages for direct mapping Jan 23 18:30:27.883062 kernel: ACPI: Early table checksum verification disabled Jan 23 18:30:27.883074 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS ) Jan 23 18:30:27.883085 kernel: ACPI: XSDT 0x000000007FB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013) Jan 23 18:30:27.883101 kernel: ACPI: FACP 0x000000007FB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 23 18:30:27.883112 kernel: ACPI: DSDT 0x000000007FB7A000 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 23 18:30:27.883124 kernel: ACPI: FACS 0x000000007FBDD000 000040 Jan 23 18:30:27.883135 kernel: ACPI: APIC 0x000000007FB78000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 23 18:30:27.883146 kernel: ACPI: HPET 
0x000000007FB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 23 18:30:27.883158 kernel: ACPI: MCFG 0x000000007FB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 23 18:30:27.883169 kernel: ACPI: WAET 0x000000007FB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 23 18:30:27.883184 kernel: ACPI: BGRT 0x000000007FB74000 000038 (v01 INTEL EDK2 00000002 01000013) Jan 23 18:30:27.883196 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb79000-0x7fb790f3] Jan 23 18:30:27.883207 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb7a000-0x7fb7c442] Jan 23 18:30:27.883218 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f] Jan 23 18:30:27.883229 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb78000-0x7fb7807f] Jan 23 18:30:27.883240 kernel: ACPI: Reserving HPET table memory at [mem 0x7fb77000-0x7fb77037] Jan 23 18:30:27.883251 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb76000-0x7fb7603b] Jan 23 18:30:27.883267 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb75000-0x7fb75027] Jan 23 18:30:27.883278 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb74000-0x7fb74037] Jan 23 18:30:27.883289 kernel: No NUMA configuration found Jan 23 18:30:27.883300 kernel: Faking a node at [mem 0x0000000000000000-0x0000000179ffffff] Jan 23 18:30:27.883312 kernel: NODE_DATA(0) allocated [mem 0x179ff8dc0-0x179ffffff] Jan 23 18:30:27.883323 kernel: Zone ranges: Jan 23 18:30:27.883335 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jan 23 18:30:27.883346 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Jan 23 18:30:27.883374 kernel: Normal [mem 0x0000000100000000-0x0000000179ffffff] Jan 23 18:30:27.883386 kernel: Device empty Jan 23 18:30:27.883397 kernel: Movable zone start for each node Jan 23 18:30:27.883408 kernel: Early memory node ranges Jan 23 18:30:27.883419 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Jan 23 18:30:27.883430 kernel: node 0: [mem 0x0000000000100000-0x000000007ed3efff] Jan 23 18:30:27.883441 kernel: node 0: [mem 0x000000007ee00000-0x000000007f8ecfff] Jan 23 18:30:27.883452 kernel: node 0: [mem 0x000000007fbff000-0x000000007ff7bfff] Jan 23 18:30:27.883468 kernel: node 0: [mem 0x0000000100000000-0x0000000179ffffff] Jan 23 18:30:27.883479 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x0000000179ffffff] Jan 23 18:30:27.883490 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 23 18:30:27.883501 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Jan 23 18:30:27.883512 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges Jan 23 18:30:27.883523 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges Jan 23 18:30:27.883535 kernel: On node 0, zone Normal: 132 pages in unavailable ranges Jan 23 18:30:27.883551 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges Jan 23 18:30:27.883562 kernel: ACPI: PM-Timer IO Port: 0x608 Jan 23 18:30:27.883574 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Jan 23 18:30:27.883585 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Jan 23 18:30:27.883596 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Jan 23 18:30:27.883630 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Jan 23 18:30:27.883641 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Jan 23 18:30:27.883657 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Jan 23 18:30:27.883668 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) 
Jan 23 18:30:27.883679 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 23 18:30:27.883690 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Jan 23 18:30:27.883702 kernel: CPU topo: Max. logical packages: 1 Jan 23 18:30:27.883713 kernel: CPU topo: Max. logical dies: 1 Jan 23 18:30:27.883741 kernel: CPU topo: Max. dies per package: 1 Jan 23 18:30:27.883753 kernel: CPU topo: Max. threads per core: 1 Jan 23 18:30:27.883765 kernel: CPU topo: Num. cores per package: 2 Jan 23 18:30:27.883776 kernel: CPU topo: Num. threads per package: 2 Jan 23 18:30:27.883792 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs Jan 23 18:30:27.883804 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Jan 23 18:30:27.883815 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices Jan 23 18:30:27.883827 kernel: Booting paravirtualized kernel on KVM Jan 23 18:30:27.883840 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 23 18:30:27.883856 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Jan 23 18:30:27.883868 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 Jan 23 18:30:27.883880 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 Jan 23 18:30:27.883891 kernel: pcpu-alloc: [0] 0 1 Jan 23 18:30:27.883903 kernel: kvm-guest: PV spinlocks disabled, no host support Jan 23 18:30:27.883916 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=ee2a61adbfdca0d8850a6d1564f6a5daa8e67e4645be01ed76a79270fe7c1051 Jan 23 18:30:27.883932 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jan 23 18:30:27.883944 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 23 18:30:27.883956 kernel: Fallback order for Node 0: 0 Jan 23 18:30:27.883968 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1022792 Jan 23 18:30:27.883979 kernel: Policy zone: Normal Jan 23 18:30:27.883991 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 23 18:30:27.884002 kernel: software IO TLB: area num 2. Jan 23 18:30:27.884018 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jan 23 18:30:27.884030 kernel: ftrace: allocating 40097 entries in 157 pages Jan 23 18:30:27.884041 kernel: ftrace: allocated 157 pages with 5 groups Jan 23 18:30:27.884053 kernel: Dynamic Preempt: voluntary Jan 23 18:30:27.884064 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 23 18:30:27.884083 kernel: rcu: RCU event tracing is enabled. Jan 23 18:30:27.884096 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jan 23 18:30:27.884108 kernel: Trampoline variant of Tasks RCU enabled. Jan 23 18:30:27.884124 kernel: Rude variant of Tasks RCU enabled. Jan 23 18:30:27.884136 kernel: Tracing variant of Tasks RCU enabled. Jan 23 18:30:27.884147 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jan 23 18:30:27.884159 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jan 23 18:30:27.884171 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
Jan 23 18:30:27.884183 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 23 18:30:27.884200 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 23 18:30:27.884216 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 Jan 23 18:30:27.884228 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Jan 23 18:30:27.884239 kernel: Console: colour dummy device 80x25 Jan 23 18:30:27.884251 kernel: printk: legacy console [tty0] enabled Jan 23 18:30:27.884263 kernel: printk: legacy console [ttyS0] enabled Jan 23 18:30:27.884275 kernel: ACPI: Core revision 20240827 Jan 23 18:30:27.884286 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns Jan 23 18:30:27.884302 kernel: APIC: Switch to symmetric I/O mode setup Jan 23 18:30:27.884314 kernel: x2apic enabled Jan 23 18:30:27.884326 kernel: APIC: Switched APIC routing to: physical x2apic Jan 23 18:30:27.884338 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Jan 23 18:30:27.884361 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns Jan 23 18:30:27.884373 kernel: Calibrating delay loop (skipped) preset value.. 4799.99 BogoMIPS (lpj=2399998) Jan 23 18:30:27.884385 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Jan 23 18:30:27.884400 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 Jan 23 18:30:27.884412 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 Jan 23 18:30:27.884424 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 23 18:30:27.884436 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jan 23 18:30:27.884448 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jan 23 18:30:27.884459 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jan 23 18:30:27.884471 kernel: active return thunk: srso_alias_return_thunk Jan 23 18:30:27.884487 kernel: Speculative Return Stack Overflow: Mitigation: Safe RET Jan 23 18:30:27.884499 kernel: Transient Scheduler Attacks: Forcing mitigation on in a VM Jan 23 18:30:27.884510 kernel: Transient Scheduler Attacks: Vulnerable: Clear CPU buffers attempted, no microcode Jan 23 18:30:27.884522 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 23 18:30:27.884534 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 23 18:30:27.884545 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 23 18:30:27.884557 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Jan 23 18:30:27.884572 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Jan 23 18:30:27.884584 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Jan 23 18:30:27.884596 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers' Jan 23 18:30:27.884771 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 23 18:30:27.884783 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 Jan 23 18:30:27.884795 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 Jan 23 18:30:27.884807 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 Jan 23 18:30:27.884824 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8 Jan 23 18:30:27.884836 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 
'compacted' format. Jan 23 18:30:27.884848 kernel: Freeing SMP alternatives memory: 32K Jan 23 18:30:27.884859 kernel: pid_max: default: 32768 minimum: 301 Jan 23 18:30:27.884871 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jan 23 18:30:27.884883 kernel: landlock: Up and running. Jan 23 18:30:27.884894 kernel: SELinux: Initializing. Jan 23 18:30:27.884910 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 23 18:30:27.884922 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 23 18:30:27.884934 kernel: smpboot: CPU0: AMD EPYC-Genoa Processor (family: 0x19, model: 0x11, stepping: 0x0) Jan 23 18:30:27.884946 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. Jan 23 18:30:27.884957 kernel: ... version: 0 Jan 23 18:30:27.884969 kernel: ... bit width: 48 Jan 23 18:30:27.884981 kernel: ... generic registers: 6 Jan 23 18:30:27.884992 kernel: ... value mask: 0000ffffffffffff Jan 23 18:30:27.885008 kernel: ... max period: 00007fffffffffff Jan 23 18:30:27.885020 kernel: ... fixed-purpose events: 0 Jan 23 18:30:27.885032 kernel: ... event mask: 000000000000003f Jan 23 18:30:27.887676 kernel: signal: max sigframe size: 3376 Jan 23 18:30:27.887693 kernel: rcu: Hierarchical SRCU implementation. Jan 23 18:30:27.887707 kernel: rcu: Max phase no-delay instances is 400. Jan 23 18:30:27.887719 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jan 23 18:30:27.887737 kernel: smp: Bringing up secondary CPUs ... Jan 23 18:30:27.887749 kernel: smpboot: x86: Booting SMP configuration: Jan 23 18:30:27.887761 kernel: .... node #0, CPUs: #1 Jan 23 18:30:27.887772 kernel: smp: Brought up 1 node, 2 CPUs Jan 23 18:30:27.887784 kernel: smpboot: Total of 2 processors activated (9599.99 BogoMIPS) Jan 23 18:30:27.887797 kernel: Memory: 3873092K/4091168K available (14336K kernel code, 2445K rwdata, 31636K rodata, 15532K init, 2508K bss, 212440K reserved, 0K cma-reserved) Jan 23 18:30:27.887809 kernel: devtmpfs: initialized Jan 23 18:30:27.887825 kernel: x86/mm: Memory block size: 128MB Jan 23 18:30:27.887837 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes) Jan 23 18:30:27.887849 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 23 18:30:27.887861 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jan 23 18:30:27.887873 kernel: pinctrl core: initialized pinctrl subsystem Jan 23 18:30:27.887885 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 23 18:30:27.887897 kernel: audit: initializing netlink subsys (disabled) Jan 23 18:30:27.887912 kernel: audit: type=2000 audit(1769193023.164:1): state=initialized audit_enabled=0 res=1 Jan 23 18:30:27.887925 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 23 18:30:27.887936 kernel: thermal_sys: Registered thermal governor 'user_space' Jan 23 18:30:27.887948 kernel: cpuidle: using governor menu Jan 23 18:30:27.887960 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 23 18:30:27.887972 kernel: dca service started, version 1.12.1 Jan 23 18:30:27.887984 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Jan 23 18:30:27.888000 kernel: PCI: Using configuration type 1 for base access Jan 23 18:30:27.888012 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Jan 23 18:30:27.888023 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 23 18:30:27.888035 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 23 18:30:27.888047 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 23 18:30:27.888059 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 23 18:30:27.888071 kernel: ACPI: Added _OSI(Module Device) Jan 23 18:30:27.888086 kernel: ACPI: Added _OSI(Processor Device) Jan 23 18:30:27.888098 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 23 18:30:27.888110 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 23 18:30:27.888122 kernel: ACPI: Interpreter enabled Jan 23 18:30:27.888133 kernel: ACPI: PM: (supports S0 S5) Jan 23 18:30:27.888152 kernel: ACPI: Using IOAPIC for interrupt routing Jan 23 18:30:27.888164 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 23 18:30:27.888179 kernel: PCI: Using E820 reservations for host bridge windows Jan 23 18:30:27.888191 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Jan 23 18:30:27.888203 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jan 23 18:30:27.888623 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 23 18:30:27.888931 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Jan 23 18:30:27.889229 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Jan 23 18:30:27.889250 kernel: PCI host bridge to bus 0000:00 Jan 23 18:30:27.889547 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 23 18:30:27.890253 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Jan 23 18:30:27.891896 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 23 18:30:27.892177 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window] Jan 23 18:30:27.892456 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window] Jan 23 18:30:27.892749 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc7ffffffff window] Jan 23 18:30:27.893011 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jan 23 18:30:27.893318 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Jan 23 18:30:27.893682 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint Jan 23 18:30:27.894963 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref] Jan 23 18:30:27.895287 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc060500000-0xc060503fff 64bit pref] Jan 23 18:30:27.895592 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8138a000-0x8138afff] Jan 23 18:30:27.895899 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref] Jan 23 18:30:27.896189 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 23 18:30:27.896495 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:30:27.897818 kernel: pci 0000:00:02.0: BAR 0 [mem 0x81389000-0x81389fff] Jan 23 18:30:27.898112 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jan 23 18:30:27.898411 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff] Jan 23 18:30:27.898721 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref] Jan 23 18:30:27.899019 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:30:27.899300 kernel: pci 0000:00:02.1: BAR 0 [mem 
0x81388000-0x81388fff] Jan 23 18:30:27.900876 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jan 23 18:30:27.901208 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff] Jan 23 18:30:27.901525 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:30:27.901837 kernel: pci 0000:00:02.2: BAR 0 [mem 0x81387000-0x81387fff] Jan 23 18:30:27.902117 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jan 23 18:30:27.902410 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff] Jan 23 18:30:27.903046 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref] Jan 23 18:30:27.903367 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:30:27.903676 kernel: pci 0000:00:02.3: BAR 0 [mem 0x81386000-0x81386fff] Jan 23 18:30:27.903957 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jan 23 18:30:27.906248 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref] Jan 23 18:30:27.906583 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:30:27.907235 kernel: pci 0000:00:02.4: BAR 0 [mem 0x81385000-0x81385fff] Jan 23 18:30:27.907545 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jan 23 18:30:27.907811 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff] Jan 23 18:30:27.907952 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref] Jan 23 18:30:27.908099 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:30:27.908240 kernel: pci 0000:00:02.5: BAR 0 [mem 0x81384000-0x81384fff] Jan 23 18:30:27.908393 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jan 23 18:30:27.908534 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff] Jan 23 18:30:27.908814 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref] Jan 23 18:30:27.908987 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:30:27.909131 kernel: pci 0000:00:02.6: BAR 0 [mem 0x81383000-0x81383fff] Jan 23 18:30:27.909317 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jan 23 18:30:27.909471 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff] Jan 23 18:30:27.909622 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref] Jan 23 18:30:27.909770 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:30:27.909909 kernel: pci 0000:00:02.7: BAR 0 [mem 0x81382000-0x81382fff] Jan 23 18:30:27.910049 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jan 23 18:30:27.910192 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff] Jan 23 18:30:27.910333 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref] Jan 23 18:30:27.910493 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 23 18:30:27.910648 kernel: pci 0000:00:03.0: BAR 0 [mem 0x81381000-0x81381fff] Jan 23 18:30:27.910793 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jan 23 18:30:27.910932 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff] Jan 23 18:30:27.911074 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref] Jan 23 18:30:27.911220 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Jan 23 18:30:27.911366 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Jan 23 18:30:27.911514 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 
conventional PCI endpoint Jan 23 18:30:27.911663 kernel: pci 0000:00:1f.2: BAR 4 [io 0x6040-0x605f] Jan 23 18:30:27.911806 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x81380000-0x81380fff] Jan 23 18:30:27.911950 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Jan 23 18:30:27.912089 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6000-0x603f] Jan 23 18:30:27.912241 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 23 18:30:27.912393 kernel: pci 0000:01:00.0: BAR 1 [mem 0x81200000-0x81200fff] Jan 23 18:30:27.912538 kernel: pci 0000:01:00.0: BAR 4 [mem 0xc060000000-0xc060003fff 64bit pref] Jan 23 18:30:27.912699 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 23 18:30:27.912838 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jan 23 18:30:27.912990 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Jan 23 18:30:27.913132 kernel: pci 0000:02:00.0: BAR 0 [mem 0x81100000-0x81103fff 64bit] Jan 23 18:30:27.913301 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jan 23 18:30:27.913460 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint Jan 23 18:30:27.913639 kernel: pci 0000:03:00.0: BAR 1 [mem 0x81000000-0x81000fff] Jan 23 18:30:27.913813 kernel: pci 0000:03:00.0: BAR 4 [mem 0xc060100000-0xc060103fff 64bit pref] Jan 23 18:30:27.913954 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jan 23 18:30:27.914135 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 23 18:30:27.914281 kernel: pci 0000:04:00.0: BAR 4 [mem 0xc060200000-0xc060203fff 64bit pref] Jan 23 18:30:27.914434 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jan 23 18:30:27.914586 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 23 18:30:27.914837 kernel: pci 0000:05:00.0: BAR 1 [mem 0x80f00000-0x80f00fff] Jan 23 18:30:27.915043 kernel: pci 0000:05:00.0: BAR 4 [mem 0xc060300000-0xc060303fff 64bit pref] Jan 23 18:30:27.915185 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jan 23 18:30:27.915337 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint Jan 23 18:30:27.915492 kernel: pci 0000:06:00.0: BAR 1 [mem 0x80e00000-0x80e00fff] Jan 23 18:30:27.915652 kernel: pci 0000:06:00.0: BAR 4 [mem 0xc060400000-0xc060403fff 64bit pref] Jan 23 18:30:27.915792 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jan 23 18:30:27.915799 kernel: acpiphp: Slot [0] registered Jan 23 18:30:27.915952 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 23 18:30:27.916099 kernel: pci 0000:07:00.0: BAR 1 [mem 0x80c00000-0x80c00fff] Jan 23 18:30:27.916248 kernel: pci 0000:07:00.0: BAR 4 [mem 0xc000000000-0xc000003fff 64bit pref] Jan 23 18:30:27.916399 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 23 18:30:27.916541 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jan 23 18:30:27.916548 kernel: acpiphp: Slot [0-2] registered Jan 23 18:30:27.916702 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jan 23 18:30:27.916710 kernel: acpiphp: Slot [0-3] registered Jan 23 18:30:27.916853 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jan 23 18:30:27.916870 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Jan 23 18:30:27.916889 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Jan 23 18:30:27.916897 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 23 18:30:27.916903 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Jan 23 18:30:27.916910 kernel: ACPI: PCI: Interrupt link LNKE 
configured for IRQ 10 Jan 23 18:30:27.916916 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Jan 23 18:30:27.916925 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Jan 23 18:30:27.916931 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Jan 23 18:30:27.916937 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Jan 23 18:30:27.916943 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Jan 23 18:30:27.916950 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Jan 23 18:30:27.916956 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Jan 23 18:30:27.916962 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Jan 23 18:30:27.916971 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Jan 23 18:30:27.916978 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Jan 23 18:30:27.916986 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Jan 23 18:30:27.916992 kernel: iommu: Default domain type: Translated Jan 23 18:30:27.917000 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 23 18:30:27.917007 kernel: efivars: Registered efivars operations Jan 23 18:30:27.917013 kernel: PCI: Using ACPI for IRQ routing Jan 23 18:30:27.917019 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 23 18:30:27.917026 kernel: e820: reserve RAM buffer [mem 0x7dc01018-0x7fffffff] Jan 23 18:30:27.917032 kernel: e820: reserve RAM buffer [mem 0x7df6f018-0x7fffffff] Jan 23 18:30:27.917038 kernel: e820: reserve RAM buffer [mem 0x7dfab018-0x7fffffff] Jan 23 18:30:27.917047 kernel: e820: reserve RAM buffer [mem 0x7ed3f000-0x7fffffff] Jan 23 18:30:27.917053 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff] Jan 23 18:30:27.917059 kernel: e820: reserve RAM buffer [mem 0x7ff7c000-0x7fffffff] Jan 23 18:30:27.917066 kernel: e820: reserve RAM buffer [mem 0x17a000000-0x17bffffff] Jan 23 18:30:27.917205 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Jan 23 18:30:27.917345 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Jan 23 18:30:27.917492 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 23 18:30:27.917502 kernel: vgaarb: loaded Jan 23 18:30:27.917508 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 Jan 23 18:30:27.917515 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter Jan 23 18:30:27.917521 kernel: clocksource: Switched to clocksource kvm-clock Jan 23 18:30:27.917528 kernel: VFS: Disk quotas dquot_6.6.0 Jan 23 18:30:27.917535 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 23 18:30:27.917541 kernel: pnp: PnP ACPI init Jan 23 18:30:27.917713 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved Jan 23 18:30:27.917723 kernel: pnp: PnP ACPI: found 5 devices Jan 23 18:30:27.917729 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 23 18:30:27.917735 kernel: NET: Registered PF_INET protocol family Jan 23 18:30:27.917744 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 23 18:30:27.917751 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jan 23 18:30:27.917757 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 23 18:30:27.917766 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 23 18:30:27.917772 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jan 23 18:30:27.917778 
kernel: TCP: Hash tables configured (established 32768 bind 32768) Jan 23 18:30:27.917785 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 23 18:30:27.917791 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 23 18:30:27.917797 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 23 18:30:27.917804 kernel: NET: Registered PF_XDP protocol family Jan 23 18:30:27.917952 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 23 18:30:27.918099 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 23 18:30:27.918241 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 23 18:30:27.918387 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 23 18:30:27.918526 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 23 18:30:27.918680 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned Jan 23 18:30:27.918827 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned Jan 23 18:30:27.919005 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned Jan 23 18:30:27.919150 kernel: pci 0000:01:00.0: ROM [mem 0x81280000-0x812fffff pref]: assigned Jan 23 18:30:27.919292 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jan 23 18:30:27.919473 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff] Jan 23 18:30:27.919628 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref] Jan 23 18:30:27.919768 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jan 23 18:30:27.919912 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff] Jan 23 18:30:27.920051 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jan 23 18:30:27.920189 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff] Jan 23 18:30:27.920327 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref] Jan 23 18:30:27.920476 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jan 23 18:30:27.920626 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref] Jan 23 18:30:27.920769 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jan 23 18:30:27.920911 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff] Jan 23 18:30:27.921049 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref] Jan 23 18:30:27.921187 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jan 23 18:30:27.921325 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff] Jan 23 18:30:27.921472 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref] Jan 23 18:30:27.921626 kernel: pci 0000:07:00.0: ROM [mem 0x80c80000-0x80cfffff pref]: assigned Jan 23 18:30:27.921764 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jan 23 18:30:27.921905 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff] Jan 23 18:30:27.922044 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff] Jan 23 18:30:27.922183 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref] Jan 23 18:30:27.922339 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jan 23 18:30:27.922488 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff] Jan 23 18:30:27.922641 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff] Jan 23 18:30:27.922784 kernel: pci 0000:00:02.7: bridge window [mem 
0xc020000000-0xc03fffffff 64bit pref] Jan 23 18:30:27.922941 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jan 23 18:30:27.923081 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff] Jan 23 18:30:27.923219 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff] Jan 23 18:30:27.923367 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref] Jan 23 18:30:27.923507 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Jan 23 18:30:27.923676 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Jan 23 18:30:27.923846 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Jan 23 18:30:27.923979 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Jan 23 18:30:27.924112 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Jan 23 18:30:27.924244 kernel: pci_bus 0000:00: resource 9 [mem 0xc000000000-0xc7ffffffff window] Jan 23 18:30:27.924497 kernel: pci_bus 0000:01: resource 1 [mem 0x81200000-0x812fffff] Jan 23 18:30:27.924657 kernel: pci_bus 0000:01: resource 2 [mem 0xc060000000-0xc0600fffff 64bit pref] Jan 23 18:30:27.924804 kernel: pci_bus 0000:02: resource 1 [mem 0x81100000-0x811fffff] Jan 23 18:30:27.924944 kernel: pci_bus 0000:03: resource 1 [mem 0x81000000-0x810fffff] Jan 23 18:30:27.925479 kernel: pci_bus 0000:03: resource 2 [mem 0xc060100000-0xc0601fffff 64bit pref] Jan 23 18:30:27.925641 kernel: pci_bus 0000:04: resource 2 [mem 0xc060200000-0xc0602fffff 64bit pref] Jan 23 18:30:27.925793 kernel: pci_bus 0000:05: resource 1 [mem 0x80f00000-0x80ffffff] Jan 23 18:30:27.925929 kernel: pci_bus 0000:05: resource 2 [mem 0xc060300000-0xc0603fffff 64bit pref] Jan 23 18:30:27.926068 kernel: pci_bus 0000:06: resource 1 [mem 0x80e00000-0x80efffff] Jan 23 18:30:27.926203 kernel: pci_bus 0000:06: resource 2 [mem 0xc060400000-0xc0604fffff 64bit pref] Jan 23 18:30:27.926454 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] Jan 23 18:30:27.926718 kernel: pci_bus 0000:07: resource 1 [mem 0x80c00000-0x80dfffff] Jan 23 18:30:27.926857 kernel: pci_bus 0000:07: resource 2 [mem 0xc000000000-0xc01fffffff 64bit pref] Jan 23 18:30:27.927000 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] Jan 23 18:30:27.927135 kernel: pci_bus 0000:08: resource 1 [mem 0x80a00000-0x80bfffff] Jan 23 18:30:27.927270 kernel: pci_bus 0000:08: resource 2 [mem 0xc020000000-0xc03fffffff 64bit pref] Jan 23 18:30:27.927422 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff] Jan 23 18:30:27.927558 kernel: pci_bus 0000:09: resource 1 [mem 0x80800000-0x809fffff] Jan 23 18:30:27.927710 kernel: pci_bus 0000:09: resource 2 [mem 0xc040000000-0xc05fffffff 64bit pref] Jan 23 18:30:27.927719 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jan 23 18:30:27.927726 kernel: PCI: CLS 0 bytes, default 64 Jan 23 18:30:27.927732 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jan 23 18:30:27.927739 kernel: software IO TLB: mapped [mem 0x0000000077ffd000-0x000000007bffd000] (64MB) Jan 23 18:30:27.927748 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns Jan 23 18:30:27.927755 kernel: Initialise system trusted keyrings Jan 23 18:30:27.927761 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 23 18:30:27.927768 kernel: Key type asymmetric registered Jan 23 18:30:27.927774 kernel: Asymmetric key parser 'x509' registered Jan 23 18:30:27.927780 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jan 23 18:30:27.927786 
kernel: io scheduler mq-deadline registered Jan 23 18:30:27.927795 kernel: io scheduler kyber registered Jan 23 18:30:27.927801 kernel: io scheduler bfq registered Jan 23 18:30:27.927943 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jan 23 18:30:27.928083 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jan 23 18:30:27.928222 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jan 23 18:30:27.928370 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jan 23 18:30:27.928509 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jan 23 18:30:27.928663 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jan 23 18:30:27.928807 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jan 23 18:30:27.928947 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Jan 23 18:30:27.929089 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jan 23 18:30:27.929228 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jan 23 18:30:27.929375 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jan 23 18:30:27.929518 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Jan 23 18:30:27.929667 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Jan 23 18:30:27.929806 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jan 23 18:30:27.929945 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jan 23 18:30:27.930083 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jan 23 18:30:27.930091 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jan 23 18:30:27.930232 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jan 23 18:30:27.930379 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jan 23 18:30:27.930387 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jan 23 18:30:27.930394 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jan 23 18:30:27.930400 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 23 18:30:27.930407 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 23 18:30:27.930416 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jan 23 18:30:27.930423 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 23 18:30:27.930429 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 23 18:30:27.930575 kernel: rtc_cmos 00:03: RTC can wake from S4 Jan 23 18:30:27.930588 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 23 18:30:27.930735 kernel: rtc_cmos 00:03: registered as rtc0 Jan 23 18:30:27.930875 kernel: rtc_cmos 00:03: setting system clock to 2026-01-23T18:30:25 UTC (1769193025) Jan 23 18:30:27.931009 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Jan 23 18:30:27.931017 kernel: amd_pstate: The CPPC feature is supported but currently disabled by the BIOS. Please enable it if your BIOS has the CPPC option. 
Jan 23 18:30:27.931024 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Jan 23 18:30:27.931031 kernel: efifb: probing for efifb Jan 23 18:30:27.931037 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Jan 23 18:30:27.931043 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Jan 23 18:30:27.931052 kernel: efifb: scrolling: redraw Jan 23 18:30:27.931059 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Jan 23 18:30:27.931065 kernel: Console: switching to colour frame buffer device 160x50 Jan 23 18:30:27.931071 kernel: fb0: EFI VGA frame buffer device Jan 23 18:30:27.931080 kernel: pstore: Using crash dump compression: deflate Jan 23 18:30:27.931086 kernel: pstore: Registered efi_pstore as persistent store backend Jan 23 18:30:27.931092 kernel: NET: Registered PF_INET6 protocol family Jan 23 18:30:27.931100 kernel: Segment Routing with IPv6 Jan 23 18:30:27.931107 kernel: In-situ OAM (IOAM) with IPv6 Jan 23 18:30:27.931113 kernel: NET: Registered PF_PACKET protocol family Jan 23 18:30:27.931120 kernel: Key type dns_resolver registered Jan 23 18:30:27.931126 kernel: IPI shorthand broadcast: enabled Jan 23 18:30:27.931132 kernel: sched_clock: Marking stable (1978011132, 237565231)->(2240600393, -25024030) Jan 23 18:30:27.931138 kernel: registered taskstats version 1 Jan 23 18:30:27.931147 kernel: Loading compiled-in X.509 certificates Jan 23 18:30:27.931153 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.66-flatcar: ed4528912f8413ae803010e63385bcf7ed197cf1' Jan 23 18:30:27.931160 kernel: Demotion targets for Node 0: null Jan 23 18:30:27.931166 kernel: Key type .fscrypt registered Jan 23 18:30:27.931172 kernel: Key type fscrypt-provisioning registered Jan 23 18:30:27.931179 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 23 18:30:27.931185 kernel: ima: Allocated hash algorithm: sha1 Jan 23 18:30:27.931193 kernel: ima: No architecture policies found Jan 23 18:30:27.931200 kernel: clk: Disabling unused clocks Jan 23 18:30:27.931207 kernel: Freeing unused kernel image (initmem) memory: 15532K Jan 23 18:30:27.931213 kernel: Write protecting the kernel read-only data: 47104k Jan 23 18:30:27.931219 kernel: Freeing unused kernel image (rodata/data gap) memory: 1132K Jan 23 18:30:27.931226 kernel: Run /init as init process Jan 23 18:30:27.931232 kernel: with arguments: Jan 23 18:30:27.931241 kernel: /init Jan 23 18:30:27.931247 kernel: with environment: Jan 23 18:30:27.931254 kernel: HOME=/ Jan 23 18:30:27.931260 kernel: TERM=linux Jan 23 18:30:27.931266 kernel: ACPI: bus type USB registered Jan 23 18:30:27.931273 kernel: usbcore: registered new interface driver usbfs Jan 23 18:30:27.931279 kernel: usbcore: registered new interface driver hub Jan 23 18:30:27.931286 kernel: usbcore: registered new device driver usb Jan 23 18:30:27.931445 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 23 18:30:27.931590 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 23 18:30:27.931745 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 23 18:30:27.931890 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 23 18:30:27.932035 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 23 18:30:27.932180 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 23 18:30:27.932365 kernel: hub 1-0:1.0: USB hub found Jan 23 18:30:27.932521 kernel: hub 1-0:1.0: 4 ports detected Jan 23 18:30:27.934976 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jan 23 18:30:27.935152 kernel: hub 2-0:1.0: USB hub found Jan 23 18:30:27.935310 kernel: hub 2-0:1.0: 4 ports detected Jan 23 18:30:27.935323 kernel: SCSI subsystem initialized Jan 23 18:30:27.935329 kernel: libata version 3.00 loaded. 
Jan 23 18:30:27.935481 kernel: ahci 0000:00:1f.2: version 3.0 Jan 23 18:30:27.935490 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jan 23 18:30:27.935645 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jan 23 18:30:27.935788 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jan 23 18:30:27.935928 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jan 23 18:30:27.936089 kernel: scsi host0: ahci Jan 23 18:30:27.936242 kernel: scsi host1: ahci Jan 23 18:30:27.936402 kernel: scsi host2: ahci Jan 23 18:30:27.936553 kernel: scsi host3: ahci Jan 23 18:30:27.936725 kernel: scsi host4: ahci Jan 23 18:30:27.936880 kernel: scsi host5: ahci Jan 23 18:30:27.936888 kernel: ata1: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380100 irq 38 lpm-pol 1 Jan 23 18:30:27.936895 kernel: ata2: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380180 irq 38 lpm-pol 1 Jan 23 18:30:27.936902 kernel: ata3: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380200 irq 38 lpm-pol 1 Jan 23 18:30:27.936908 kernel: ata4: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380280 irq 38 lpm-pol 1 Jan 23 18:30:27.936915 kernel: ata5: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380300 irq 38 lpm-pol 1 Jan 23 18:30:27.936924 kernel: ata6: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380380 irq 38 lpm-pol 1 Jan 23 18:30:27.937093 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 23 18:30:27.937102 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 23 18:30:27.937109 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 23 18:30:27.937115 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 23 18:30:27.937122 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 23 18:30:27.937130 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 23 18:30:27.937137 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Jan 23 18:30:27.937143 kernel: ata1.00: LPM support broken, forcing max_power Jan 23 18:30:27.937150 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Jan 23 18:30:27.937156 kernel: ata1.00: applying bridge limits Jan 23 18:30:27.937162 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 23 18:30:27.937169 kernel: ata1.00: LPM support broken, forcing max_power Jan 23 18:30:27.937175 kernel: ata1.00: configured for UDMA/100 Jan 23 18:30:27.937343 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Jan 23 18:30:27.937357 kernel: usbcore: registered new interface driver usbhid Jan 23 18:30:27.937364 kernel: usbhid: USB HID core driver Jan 23 18:30:27.937519 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Jan 23 18:30:27.937527 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 23 18:30:27.937695 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Jan 23 18:30:27.937854 kernel: scsi host6: Virtio SCSI HBA Jan 23 18:30:27.938022 kernel: scsi 6:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Jan 23 18:30:27.938183 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Jan 23 18:30:27.938191 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Jan 23 18:30:27.938357 kernel: sd 6:0:0:0: Power-on or device reset occurred Jan 23 18:30:27.938540 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 23 18:30:27.940200 kernel: sd 6:0:0:0: [sda] 160006144 512-byte logical blocks: (81.9 GB/76.3 GiB) Jan 23 
18:30:27.940383 kernel: sd 6:0:0:0: [sda] Write Protect is off Jan 23 18:30:27.940547 kernel: sd 6:0:0:0: [sda] Mode Sense: 63 00 00 08 Jan 23 18:30:27.940727 kernel: sd 6:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Jan 23 18:30:27.940735 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 23 18:30:27.940746 kernel: GPT:25804799 != 160006143 Jan 23 18:30:27.940752 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 23 18:30:27.940759 kernel: GPT:25804799 != 160006143 Jan 23 18:30:27.940765 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 23 18:30:27.940771 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 23 18:30:27.940931 kernel: sd 6:0:0:0: [sda] Attached SCSI disk Jan 23 18:30:27.940939 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 23 18:30:27.940948 kernel: device-mapper: uevent: version 1.0.3 Jan 23 18:30:27.940955 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 23 18:30:27.940961 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Jan 23 18:30:27.940967 kernel: raid6: avx512x4 gen() 19281 MB/s Jan 23 18:30:27.940974 kernel: raid6: avx512x2 gen() 22220 MB/s Jan 23 18:30:27.940980 kernel: raid6: avx512x1 gen() 23658 MB/s Jan 23 18:30:27.940986 kernel: raid6: avx2x4 gen() 46120 MB/s Jan 23 18:30:27.940995 kernel: raid6: avx2x2 gen() 49398 MB/s Jan 23 18:30:27.941001 kernel: raid6: avx2x1 gen() 41077 MB/s Jan 23 18:30:27.941007 kernel: raid6: using algorithm avx2x2 gen() 49398 MB/s Jan 23 18:30:27.941013 kernel: raid6: .... xor() 37040 MB/s, rmw enabled Jan 23 18:30:27.941020 kernel: raid6: using avx512x2 recovery algorithm Jan 23 18:30:27.941026 kernel: xor: automatically using best checksumming function avx Jan 23 18:30:27.941033 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 23 18:30:27.941041 kernel: BTRFS: device fsid ae5f9861-c401-42b4-99c9-2e3fe0b343c2 devid 1 transid 34 /dev/mapper/usr (254:0) scanned by mount (183) Jan 23 18:30:27.941048 kernel: BTRFS info (device dm-0): first mount of filesystem ae5f9861-c401-42b4-99c9-2e3fe0b343c2 Jan 23 18:30:27.941055 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 23 18:30:27.941061 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 23 18:30:27.941067 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 23 18:30:27.941074 kernel: BTRFS info (device dm-0): enabling free space tree Jan 23 18:30:27.941080 kernel: loop: module loaded Jan 23 18:30:27.941089 kernel: loop0: detected capacity change from 0 to 100560 Jan 23 18:30:27.941095 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 23 18:30:27.941103 systemd[1]: Successfully made /usr/ read-only. Jan 23 18:30:27.941111 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 23 18:30:27.941118 systemd[1]: Detected virtualization kvm. Jan 23 18:30:27.941125 systemd[1]: Detected architecture x86-64. Jan 23 18:30:27.941134 systemd[1]: Running in initrd. Jan 23 18:30:27.941140 systemd[1]: No hostname configured, using default hostname. 
Jan 23 18:30:27.941147 systemd[1]: Hostname set to . Jan 23 18:30:27.941154 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 23 18:30:27.941160 systemd[1]: Queued start job for default target initrd.target. Jan 23 18:30:27.941167 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 23 18:30:27.941174 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 23 18:30:27.941183 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 23 18:30:27.941199 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 23 18:30:27.941212 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 23 18:30:27.941219 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 23 18:30:27.941226 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 23 18:30:27.941236 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 23 18:30:27.941243 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 23 18:30:27.941250 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 23 18:30:27.941256 systemd[1]: Reached target paths.target - Path Units. Jan 23 18:30:27.941263 systemd[1]: Reached target slices.target - Slice Units. Jan 23 18:30:27.941270 systemd[1]: Reached target swap.target - Swaps. Jan 23 18:30:27.941276 systemd[1]: Reached target timers.target - Timer Units. Jan 23 18:30:27.941286 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 23 18:30:27.941292 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 23 18:30:27.941299 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 23 18:30:27.941306 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 23 18:30:27.941312 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 23 18:30:27.941319 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 23 18:30:27.941326 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 23 18:30:27.941335 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 23 18:30:27.941341 systemd[1]: Reached target sockets.target - Socket Units. Jan 23 18:30:27.941355 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 23 18:30:27.941362 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 23 18:30:27.941368 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 23 18:30:27.941375 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 23 18:30:27.941385 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 23 18:30:27.941391 systemd[1]: Starting systemd-fsck-usr.service... Jan 23 18:30:27.941398 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 23 18:30:27.941405 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Jan 23 18:30:27.941412 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 18:30:27.941421 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 23 18:30:27.941427 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 23 18:30:27.941434 systemd[1]: Finished systemd-fsck-usr.service. Jan 23 18:30:27.941441 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 23 18:30:27.941472 systemd-journald[321]: Collecting audit messages is enabled. Jan 23 18:30:27.941491 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 23 18:30:27.941499 kernel: audit: type=1130 audit(1769193027.924:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:27.941505 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 23 18:30:27.941514 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 23 18:30:27.941522 systemd-journald[321]: Journal started Jan 23 18:30:27.941536 systemd-journald[321]: Runtime Journal (/run/log/journal/57e2e020f17145f78e03fb39fdc0acda) is 8M, max 76M, 68M free. Jan 23 18:30:27.924000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:27.946044 systemd[1]: Started systemd-journald.service - Journal Service. Jan 23 18:30:27.944000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:27.951616 kernel: audit: type=1130 audit(1769193027.944:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:27.953643 kernel: Bridge firewalling registered Jan 23 18:30:27.954146 systemd-modules-load[323]: Inserted module 'br_netfilter' Jan 23 18:30:27.954000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:27.954360 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 23 18:30:27.964714 kernel: audit: type=1130 audit(1769193027.954:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:27.955099 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 18:30:27.957821 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 23 18:30:27.968772 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 23 18:30:27.971720 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Jan 23 18:30:27.980670 kernel: audit: type=1130 audit(1769193027.969:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:27.969000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:27.973615 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 23 18:30:27.981000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:27.979440 systemd-tmpfiles[336]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 23 18:30:27.988766 kernel: audit: type=1130 audit(1769193027.981:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:27.992000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:27.993140 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 23 18:30:27.999628 kernel: audit: type=1130 audit(1769193027.992:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:27.999742 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 23 18:30:28.006030 kernel: audit: type=1130 audit(1769193027.999:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:27.999000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.006750 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 23 18:30:28.010672 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 23 18:30:28.010000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.016616 kernel: audit: type=1130 audit(1769193028.010:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.017000 audit: BPF prog-id=6 op=LOAD Jan 23 18:30:28.022988 kernel: audit: type=1334 audit(1769193028.017:10): prog-id=6 op=LOAD Jan 23 18:30:28.022716 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Jan 23 18:30:28.032210 dracut-cmdline[356]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=ee2a61adbfdca0d8850a6d1564f6a5daa8e67e4645be01ed76a79270fe7c1051 Jan 23 18:30:28.071279 systemd-resolved[360]: Positive Trust Anchors: Jan 23 18:30:28.071951 systemd-resolved[360]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 23 18:30:28.071958 systemd-resolved[360]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 23 18:30:28.071980 systemd-resolved[360]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 23 18:30:28.092675 systemd-resolved[360]: Defaulting to hostname 'linux'. Jan 23 18:30:28.093000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.094055 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 23 18:30:28.094555 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 23 18:30:28.119626 kernel: Loading iSCSI transport class v2.0-870. Jan 23 18:30:28.132628 kernel: iscsi: registered transport (tcp) Jan 23 18:30:28.164284 kernel: iscsi: registered transport (qla4xxx) Jan 23 18:30:28.164334 kernel: QLogic iSCSI HBA Driver Jan 23 18:30:28.195478 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 23 18:30:28.225894 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 23 18:30:28.225000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.226693 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 23 18:30:28.289465 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 23 18:30:28.289000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.292527 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 23 18:30:28.296765 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 23 18:30:28.334320 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 23 18:30:28.334000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:28.334000 audit: BPF prog-id=7 op=LOAD Jan 23 18:30:28.335000 audit: BPF prog-id=8 op=LOAD Jan 23 18:30:28.336583 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 23 18:30:28.359439 systemd-udevd[604]: Using default interface naming scheme 'v257'. Jan 23 18:30:28.369000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.369417 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 23 18:30:28.373449 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 23 18:30:28.403775 dracut-pre-trigger[642]: rd.md=0: removing MD RAID activation Jan 23 18:30:28.422438 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 23 18:30:28.422000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.423000 audit: BPF prog-id=9 op=LOAD Jan 23 18:30:28.425268 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 23 18:30:28.447871 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 23 18:30:28.448000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.452828 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 23 18:30:28.471108 systemd-networkd[719]: lo: Link UP Jan 23 18:30:28.471854 systemd-networkd[719]: lo: Gained carrier Jan 23 18:30:28.473069 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 23 18:30:28.473000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.474253 systemd[1]: Reached target network.target - Network. Jan 23 18:30:28.565484 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 23 18:30:28.565000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.568721 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 23 18:30:28.689416 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Jan 23 18:30:28.712006 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Jan 23 18:30:28.731669 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Jan 23 18:30:28.734626 kernel: cryptd: max_cpu_qlen set to 1000 Jan 23 18:30:28.739736 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jan 23 18:30:28.744989 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 23 18:30:28.771869 disk-uuid[783]: Primary Header is updated. Jan 23 18:30:28.771869 disk-uuid[783]: Secondary Entries is updated. Jan 23 18:30:28.771869 disk-uuid[783]: Secondary Header is updated. 
Jan 23 18:30:28.773365 kernel: AES CTR mode by8 optimization enabled Jan 23 18:30:28.777918 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 23 18:30:28.778042 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 18:30:28.780000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.780944 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 18:30:28.792668 systemd-networkd[719]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 18:30:28.792676 systemd-networkd[719]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 23 18:30:28.793007 systemd-networkd[719]: eth0: Link UP Jan 23 18:30:28.798868 systemd-networkd[719]: eth0: Gained carrier Jan 23 18:30:28.798884 systemd-networkd[719]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 18:30:28.799962 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Jan 23 18:30:28.800936 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 18:30:28.807559 systemd-networkd[719]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 18:30:28.807566 systemd-networkd[719]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 23 18:30:28.807858 systemd-networkd[719]: eth1: Link UP Jan 23 18:30:28.811234 systemd-networkd[719]: eth1: Gained carrier Jan 23 18:30:28.811245 systemd-networkd[719]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 18:30:28.816904 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 23 18:30:28.820046 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 18:30:28.827000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.827000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.838651 systemd-networkd[719]: eth0: DHCPv4 address 46.62.169.9/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jan 23 18:30:28.842916 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 18:30:28.845651 systemd-networkd[719]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Jan 23 18:30:28.866991 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 18:30:28.867000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:28.869369 kernel: kauditd_printk_skb: 15 callbacks suppressed Jan 23 18:30:28.869399 kernel: audit: type=1130 audit(1769193028.867:26): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.942287 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 23 18:30:28.948436 kernel: audit: type=1130 audit(1769193028.942:27): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.942000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.943317 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 23 18:30:28.948840 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 23 18:30:28.949588 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 23 18:30:28.951186 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 23 18:30:28.972971 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 23 18:30:28.979288 kernel: audit: type=1130 audit(1769193028.973:28): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:28.973000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:29.851966 disk-uuid[784]: Warning: The kernel is still using the old partition table. Jan 23 18:30:29.851966 disk-uuid[784]: The new table will be used at the next reboot or after you Jan 23 18:30:29.851966 disk-uuid[784]: run partprobe(8) or kpartx(8) Jan 23 18:30:29.851966 disk-uuid[784]: The operation has completed successfully. Jan 23 18:30:29.862548 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 23 18:30:29.862818 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 23 18:30:29.889692 kernel: audit: type=1130 audit(1769193029.864:29): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:29.889747 kernel: audit: type=1131 audit(1769193029.864:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:29.864000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:29.864000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:29.866936 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Jan 23 18:30:29.874908 systemd-networkd[719]: eth0: Gained IPv6LL Jan 23 18:30:29.934665 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (904) Jan 23 18:30:29.941883 kernel: BTRFS info (device sda6): first mount of filesystem 65a96faf-6d02-485d-b2fc-84eb49ece660 Jan 23 18:30:29.941935 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 23 18:30:29.955465 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 23 18:30:29.955519 kernel: BTRFS info (device sda6): turning on async discard Jan 23 18:30:29.958660 kernel: BTRFS info (device sda6): enabling free space tree Jan 23 18:30:29.978655 kernel: BTRFS info (device sda6): last unmount of filesystem 65a96faf-6d02-485d-b2fc-84eb49ece660 Jan 23 18:30:29.980115 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 23 18:30:29.994360 kernel: audit: type=1130 audit(1769193029.980:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:29.980000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:29.984007 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 23 18:30:30.216409 ignition[923]: Ignition 2.24.0 Jan 23 18:30:30.216435 ignition[923]: Stage: fetch-offline Jan 23 18:30:30.216501 ignition[923]: no configs at "/usr/lib/ignition/base.d" Jan 23 18:30:30.216523 ignition[923]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 23 18:30:30.219411 ignition[923]: parsed url from cmdline: "" Jan 23 18:30:30.219422 ignition[923]: no config URL provided Jan 23 18:30:30.219441 ignition[923]: reading system config file "/usr/lib/ignition/user.ign" Jan 23 18:30:30.219466 ignition[923]: no config at "/usr/lib/ignition/user.ign" Jan 23 18:30:30.238954 kernel: audit: type=1130 audit(1769193030.224:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:30.224000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:30.224088 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 23 18:30:30.219478 ignition[923]: failed to fetch config: resource requires networking Jan 23 18:30:30.228936 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Jan 23 18:30:30.220056 ignition[923]: Ignition finished successfully Jan 23 18:30:30.270547 ignition[930]: Ignition 2.24.0 Jan 23 18:30:30.270572 ignition[930]: Stage: fetch Jan 23 18:30:30.270848 ignition[930]: no configs at "/usr/lib/ignition/base.d" Jan 23 18:30:30.270870 ignition[930]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 23 18:30:30.271024 ignition[930]: parsed url from cmdline: "" Jan 23 18:30:30.271032 ignition[930]: no config URL provided Jan 23 18:30:30.271048 ignition[930]: reading system config file "/usr/lib/ignition/user.ign" Jan 23 18:30:30.271063 ignition[930]: no config at "/usr/lib/ignition/user.ign" Jan 23 18:30:30.271102 ignition[930]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Jan 23 18:30:30.283219 ignition[930]: GET result: OK Jan 23 18:30:30.283576 ignition[930]: parsing config with SHA512: 55c6eb25578862af6deeb86bf0ffd4732e3ee42d0df99c805da66a43961ef277f086b089295ee0c7201e3b37427a886b961cb7e1a95611caee3af850d420e611 Jan 23 18:30:30.293792 unknown[930]: fetched base config from "system" Jan 23 18:30:30.293813 unknown[930]: fetched base config from "system" Jan 23 18:30:30.294331 ignition[930]: fetch: fetch complete Jan 23 18:30:30.293825 unknown[930]: fetched user config from "hetzner" Jan 23 18:30:30.294342 ignition[930]: fetch: fetch passed Jan 23 18:30:30.294438 ignition[930]: Ignition finished successfully Jan 23 18:30:30.300158 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 23 18:30:30.313802 kernel: audit: type=1130 audit(1769193030.300:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:30.300000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:30.303871 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 23 18:30:30.351106 ignition[937]: Ignition 2.24.0 Jan 23 18:30:30.351128 ignition[937]: Stage: kargs Jan 23 18:30:30.351338 ignition[937]: no configs at "/usr/lib/ignition/base.d" Jan 23 18:30:30.351356 ignition[937]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 23 18:30:30.354360 ignition[937]: kargs: kargs passed Jan 23 18:30:30.354448 ignition[937]: Ignition finished successfully Jan 23 18:30:30.357165 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 23 18:30:30.368441 kernel: audit: type=1130 audit(1769193030.357:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:30.357000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:30.361253 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Jan 23 18:30:30.397119 ignition[944]: Ignition 2.24.0 Jan 23 18:30:30.398307 ignition[944]: Stage: disks Jan 23 18:30:30.398675 ignition[944]: no configs at "/usr/lib/ignition/base.d" Jan 23 18:30:30.398698 ignition[944]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 23 18:30:30.401859 ignition[944]: disks: disks passed Jan 23 18:30:30.402498 ignition[944]: Ignition finished successfully Jan 23 18:30:30.406222 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 23 18:30:30.417436 kernel: audit: type=1130 audit(1769193030.406:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:30.406000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:30.407565 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 23 18:30:30.418309 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 23 18:30:30.419684 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 23 18:30:30.421037 systemd[1]: Reached target sysinit.target - System Initialization. Jan 23 18:30:30.422412 systemd[1]: Reached target basic.target - Basic System. Jan 23 18:30:30.425866 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 23 18:30:30.476497 systemd-fsck[952]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 23 18:30:30.480000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:30.479954 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 23 18:30:30.484597 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 23 18:30:30.624625 kernel: EXT4-fs (sda9): mounted filesystem eebf2bdd-2461-4b18-9f37-721daf86511d r/w with ordered data mode. Quota mode: none. Jan 23 18:30:30.626254 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 23 18:30:30.628164 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 23 18:30:30.631656 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 23 18:30:30.634170 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 23 18:30:30.637107 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Jan 23 18:30:30.637861 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 23 18:30:30.638554 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 23 18:30:30.638723 systemd-networkd[719]: eth1: Gained IPv6LL Jan 23 18:30:30.651628 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (960) Jan 23 18:30:30.656625 kernel: BTRFS info (device sda6): first mount of filesystem 65a96faf-6d02-485d-b2fc-84eb49ece660 Jan 23 18:30:30.656843 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 23 18:30:30.657629 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 23 18:30:30.662209 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Jan 23 18:30:30.679405 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 23 18:30:30.679434 kernel: BTRFS info (device sda6): turning on async discard Jan 23 18:30:30.679444 kernel: BTRFS info (device sda6): enabling free space tree Jan 23 18:30:30.685204 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 23 18:30:30.780214 coreos-metadata[962]: Jan 23 18:30:30.780 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Jan 23 18:30:30.782941 coreos-metadata[962]: Jan 23 18:30:30.782 INFO Fetch successful Jan 23 18:30:30.782941 coreos-metadata[962]: Jan 23 18:30:30.782 INFO wrote hostname ci-4547-1-0-c-e2d32aff86 to /sysroot/etc/hostname Jan 23 18:30:30.787232 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jan 23 18:30:30.787000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:30.970203 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 23 18:30:30.970000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:30.974130 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 23 18:30:30.977238 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 23 18:30:31.004405 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 23 18:30:31.008861 kernel: BTRFS info (device sda6): last unmount of filesystem 65a96faf-6d02-485d-b2fc-84eb49ece660 Jan 23 18:30:31.043336 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 23 18:30:31.044000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:31.053681 ignition[1062]: INFO : Ignition 2.24.0 Jan 23 18:30:31.055635 ignition[1062]: INFO : Stage: mount Jan 23 18:30:31.055635 ignition[1062]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 23 18:30:31.055635 ignition[1062]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 23 18:30:31.059638 ignition[1062]: INFO : mount: mount passed Jan 23 18:30:31.061661 ignition[1062]: INFO : Ignition finished successfully Jan 23 18:30:31.064092 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 23 18:30:31.064000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:31.067312 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 23 18:30:31.096006 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Jan 23 18:30:31.130660 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1073) Jan 23 18:30:31.130722 kernel: BTRFS info (device sda6): first mount of filesystem 65a96faf-6d02-485d-b2fc-84eb49ece660 Jan 23 18:30:31.135699 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 23 18:30:31.152079 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 23 18:30:31.152127 kernel: BTRFS info (device sda6): turning on async discard Jan 23 18:30:31.152157 kernel: BTRFS info (device sda6): enabling free space tree Jan 23 18:30:31.160459 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 23 18:30:31.202230 ignition[1089]: INFO : Ignition 2.24.0 Jan 23 18:30:31.202230 ignition[1089]: INFO : Stage: files Jan 23 18:30:31.204324 ignition[1089]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 23 18:30:31.204324 ignition[1089]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 23 18:30:31.204324 ignition[1089]: DEBUG : files: compiled without relabeling support, skipping Jan 23 18:30:31.206659 ignition[1089]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 23 18:30:31.206659 ignition[1089]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 23 18:30:31.213497 ignition[1089]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 23 18:30:31.214765 ignition[1089]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 23 18:30:31.214765 ignition[1089]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 23 18:30:31.214364 unknown[1089]: wrote ssh authorized keys file for user: core Jan 23 18:30:31.217912 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 23 18:30:31.217912 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Jan 23 18:30:31.458390 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 23 18:30:31.765071 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 23 18:30:31.767080 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 23 18:30:31.767080 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 23 18:30:31.767080 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 23 18:30:31.767080 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 23 18:30:31.767080 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 23 18:30:31.767080 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 23 18:30:31.767080 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 23 18:30:31.767080 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file 
"/sysroot/home/core/nfs-pvc.yaml" Jan 23 18:30:31.774210 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 23 18:30:31.774210 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 23 18:30:31.774210 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 23 18:30:31.774210 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 23 18:30:31.774210 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 23 18:30:31.774210 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Jan 23 18:30:32.121862 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 23 18:30:34.716765 ignition[1089]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 23 18:30:34.716765 ignition[1089]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 23 18:30:34.720259 ignition[1089]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 23 18:30:34.722847 ignition[1089]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 23 18:30:34.722847 ignition[1089]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 23 18:30:34.722847 ignition[1089]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Jan 23 18:30:34.729197 ignition[1089]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jan 23 18:30:34.729197 ignition[1089]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jan 23 18:30:34.729197 ignition[1089]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Jan 23 18:30:34.729197 ignition[1089]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Jan 23 18:30:34.729197 ignition[1089]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Jan 23 18:30:34.729197 ignition[1089]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 23 18:30:34.729197 ignition[1089]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 23 18:30:34.729197 ignition[1089]: INFO : files: files passed Jan 23 18:30:34.729197 ignition[1089]: INFO : Ignition finished successfully Jan 23 18:30:34.789990 kernel: kauditd_printk_skb: 5 callbacks suppressed Jan 23 18:30:34.790029 kernel: audit: type=1130 audit(1769193034.728:41): pid=1 uid=0 auid=4294967295 ses=4294967295 
subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:34.790067 kernel: audit: type=1130 audit(1769193034.761:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:34.790091 kernel: audit: type=1131 audit(1769193034.761:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:34.728000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:34.761000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:34.761000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:34.728665 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 23 18:30:34.732866 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 23 18:30:34.754784 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 23 18:30:34.760350 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 23 18:30:34.760566 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 23 18:30:34.817689 initrd-setup-root-after-ignition[1122]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 23 18:30:34.817689 initrd-setup-root-after-ignition[1122]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 23 18:30:34.821061 initrd-setup-root-after-ignition[1126]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 23 18:30:34.822565 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 23 18:30:34.838660 kernel: audit: type=1130 audit(1769193034.823:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:34.823000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:34.824811 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 23 18:30:34.841365 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 23 18:30:34.922217 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 23 18:30:34.922461 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 23 18:30:34.950564 kernel: audit: type=1130 audit(1769193034.923:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:34.950631 kernel: audit: type=1131 audit(1769193034.924:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:34.923000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:34.924000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:34.925047 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 23 18:30:34.951673 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 23 18:30:34.954130 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 23 18:30:34.956091 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 23 18:30:35.011768 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 23 18:30:35.026469 kernel: audit: type=1130 audit(1769193035.012:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.012000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.016850 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 23 18:30:35.055973 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 23 18:30:35.056347 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 23 18:30:35.057691 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 23 18:30:35.059803 systemd[1]: Stopped target timers.target - Timer Units. Jan 23 18:30:35.077010 kernel: audit: type=1131 audit(1769193035.063:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.063000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.061842 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 23 18:30:35.062092 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 23 18:30:35.077171 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 23 18:30:35.079360 systemd[1]: Stopped target basic.target - Basic System. Jan 23 18:30:35.081239 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 23 18:30:35.083260 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 23 18:30:35.085128 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 23 18:30:35.087175 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 23 18:30:35.089131 systemd[1]: Stopped target remote-fs.target - Remote File Systems. 
Jan 23 18:30:35.091077 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 23 18:30:35.093224 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 23 18:30:35.095304 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 23 18:30:35.097436 systemd[1]: Stopped target swap.target - Swaps. Jan 23 18:30:35.115911 kernel: audit: type=1131 audit(1769193035.101:49): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.101000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.099501 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 23 18:30:35.099749 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 23 18:30:35.116054 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 23 18:30:35.118030 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 23 18:30:35.119905 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 23 18:30:35.139890 kernel: audit: type=1131 audit(1769193035.123:50): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.123000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.120106 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 23 18:30:35.121700 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 23 18:30:35.121876 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 23 18:30:35.142000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.140247 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 23 18:30:35.144000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.140483 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 23 18:30:35.146000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.143335 systemd[1]: ignition-files.service: Deactivated successfully. Jan 23 18:30:35.143593 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 23 18:30:35.145183 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Jan 23 18:30:35.145442 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jan 23 18:30:35.149780 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 23 18:30:35.151431 systemd[1]: kmod-static-nodes.service: Deactivated successfully. 
Jan 23 18:30:35.155000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.152806 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 23 18:30:35.158947 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 23 18:30:35.160224 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 23 18:30:35.162000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.160512 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 23 18:30:35.163278 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 23 18:30:35.163542 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 23 18:30:35.168000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.168920 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 23 18:30:35.171000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.169098 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 23 18:30:35.184999 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 23 18:30:35.185195 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 23 18:30:35.187000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.188000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.199571 ignition[1146]: INFO : Ignition 2.24.0 Jan 23 18:30:35.199571 ignition[1146]: INFO : Stage: umount Jan 23 18:30:35.204323 ignition[1146]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 23 18:30:35.204323 ignition[1146]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 23 18:30:35.204323 ignition[1146]: INFO : umount: umount passed Jan 23 18:30:35.204323 ignition[1146]: INFO : Ignition finished successfully Jan 23 18:30:35.208168 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 23 18:30:35.208429 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 23 18:30:35.211000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.212352 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 23 18:30:35.212526 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 23 18:30:35.214000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:35.215476 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 23 18:30:35.216479 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 23 18:30:35.217000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.218388 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 23 18:30:35.219372 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 23 18:30:35.220000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.221247 systemd[1]: Stopped target network.target - Network. Jan 23 18:30:35.222965 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 23 18:30:35.223900 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 23 18:30:35.224000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.225682 systemd[1]: Stopped target paths.target - Path Units. Jan 23 18:30:35.226912 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 23 18:30:35.227450 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 23 18:30:35.229146 systemd[1]: Stopped target slices.target - Slice Units. Jan 23 18:30:35.230673 systemd[1]: Stopped target sockets.target - Socket Units. Jan 23 18:30:35.231717 systemd[1]: iscsid.socket: Deactivated successfully. Jan 23 18:30:35.231802 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 23 18:30:35.233356 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 23 18:30:35.233442 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 23 18:30:35.238000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.234835 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 23 18:30:35.239000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.234904 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 23 18:30:35.236247 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 23 18:30:35.236343 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 23 18:30:35.260000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.239495 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 23 18:30:35.239591 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 23 18:30:35.240742 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 23 18:30:35.245620 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 23 18:30:35.257763 systemd[1]: sysroot-boot.mount: Deactivated successfully. 
Jan 23 18:30:35.259931 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 23 18:30:35.260107 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 23 18:30:35.268466 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 23 18:30:35.269726 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 23 18:30:35.269000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.274053 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 23 18:30:35.274282 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 23 18:30:35.274000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.275000 audit: BPF prog-id=6 op=UNLOAD Jan 23 18:30:35.276000 audit: BPF prog-id=9 op=UNLOAD Jan 23 18:30:35.277983 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 23 18:30:35.278892 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 23 18:30:35.278999 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 23 18:30:35.281000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.280358 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 23 18:30:35.280472 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 23 18:30:35.283826 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 23 18:30:35.284640 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 23 18:30:35.286000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.287000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.287000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.284749 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 23 18:30:35.287045 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 23 18:30:35.287126 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 23 18:30:35.287834 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 23 18:30:35.287900 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 23 18:30:35.292071 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 23 18:30:35.309635 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 23 18:30:35.310548 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 23 18:30:35.312291 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. 
Jan 23 18:30:35.311000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.312366 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 23 18:30:35.314529 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 23 18:30:35.316000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.314594 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 23 18:30:35.315251 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 23 18:30:35.315317 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 23 18:30:35.316930 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 23 18:30:35.319000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.317007 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 23 18:30:35.320461 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 23 18:30:35.321000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.320559 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 23 18:30:35.324285 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 23 18:30:35.325000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.325130 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 23 18:30:35.328000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.325234 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 23 18:30:35.330000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.326257 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 23 18:30:35.326347 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 23 18:30:35.329570 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 23 18:30:35.329695 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 18:30:35.350673 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 23 18:30:35.350908 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 23 18:30:35.351000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:35.363302 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 23 18:30:35.363577 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 23 18:30:35.364000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.364000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:35.365446 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 23 18:30:35.368135 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 23 18:30:35.391870 systemd[1]: Switching root. Jan 23 18:30:35.436156 systemd-journald[321]: Journal stopped Jan 23 18:30:36.684222 systemd-journald[321]: Received SIGTERM from PID 1 (systemd). Jan 23 18:30:36.684279 kernel: SELinux: policy capability network_peer_controls=1 Jan 23 18:30:36.684295 kernel: SELinux: policy capability open_perms=1 Jan 23 18:30:36.684307 kernel: SELinux: policy capability extended_socket_class=1 Jan 23 18:30:36.684316 kernel: SELinux: policy capability always_check_network=0 Jan 23 18:30:36.684325 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 23 18:30:36.684337 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 23 18:30:36.684347 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 23 18:30:36.684356 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 23 18:30:36.684365 kernel: SELinux: policy capability userspace_initial_context=0 Jan 23 18:30:36.684378 systemd[1]: Successfully loaded SELinux policy in 113.640ms. Jan 23 18:30:36.684400 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 17.528ms. Jan 23 18:30:36.684410 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 23 18:30:36.684433 systemd[1]: Detected virtualization kvm. Jan 23 18:30:36.684442 systemd[1]: Detected architecture x86-64. Jan 23 18:30:36.684452 systemd[1]: Detected first boot. Jan 23 18:30:36.684462 systemd[1]: Hostname set to . Jan 23 18:30:36.684471 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 23 18:30:36.684484 zram_generator::config[1190]: No configuration found. Jan 23 18:30:36.684503 kernel: Guest personality initialized and is inactive Jan 23 18:30:36.684512 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Jan 23 18:30:36.684525 kernel: Initialized host personality Jan 23 18:30:36.684534 kernel: NET: Registered PF_VSOCK protocol family Jan 23 18:30:36.684544 systemd[1]: Populated /etc with preset unit settings. Jan 23 18:30:36.684553 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 23 18:30:36.684563 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 23 18:30:36.684573 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 23 18:30:36.684588 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. 
Jan 23 18:30:36.684621 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 23 18:30:36.684631 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 23 18:30:36.684641 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 23 18:30:36.684652 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 23 18:30:36.684661 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 23 18:30:36.684681 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 23 18:30:36.684690 systemd[1]: Created slice user.slice - User and Session Slice. Jan 23 18:30:36.684699 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 23 18:30:36.684710 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 23 18:30:36.684720 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 23 18:30:36.684729 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 23 18:30:36.684739 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 23 18:30:36.684751 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 23 18:30:36.684761 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 23 18:30:36.684771 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 23 18:30:36.684782 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 23 18:30:36.684792 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 23 18:30:36.684804 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 23 18:30:36.684814 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 23 18:30:36.684824 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 23 18:30:36.684833 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 23 18:30:36.684843 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 23 18:30:36.684853 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 23 18:30:36.684862 systemd[1]: Reached target slices.target - Slice Units. Jan 23 18:30:36.684875 systemd[1]: Reached target swap.target - Swaps. Jan 23 18:30:36.684885 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 23 18:30:36.684895 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 23 18:30:36.684905 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 23 18:30:36.684914 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 23 18:30:36.684924 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 23 18:30:36.684934 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 23 18:30:36.684945 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 23 18:30:36.684955 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 23 18:30:36.684964 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. 
Jan 23 18:30:36.684974 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 23 18:30:36.684983 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 23 18:30:36.684993 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 23 18:30:36.685002 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 23 18:30:36.685017 systemd[1]: Mounting media.mount - External Media Directory... Jan 23 18:30:36.685026 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 23 18:30:36.685036 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 23 18:30:36.685046 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 23 18:30:36.685056 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 23 18:30:36.685065 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 23 18:30:36.685075 systemd[1]: Reached target machines.target - Containers. Jan 23 18:30:36.685087 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 23 18:30:36.685097 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 23 18:30:36.685107 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 23 18:30:36.685116 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 23 18:30:36.685126 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 23 18:30:36.685135 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 23 18:30:36.685147 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 23 18:30:36.685156 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 23 18:30:36.685166 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 23 18:30:36.685176 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 23 18:30:36.685188 kernel: fuse: init (API version 7.41) Jan 23 18:30:36.685198 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 23 18:30:36.685208 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 23 18:30:36.685219 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 23 18:30:36.685229 systemd[1]: Stopped systemd-fsck-usr.service. Jan 23 18:30:36.685239 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 23 18:30:36.685248 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 23 18:30:36.685260 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 23 18:30:36.685270 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 23 18:30:36.685280 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 23 18:30:36.685289 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... 
Jan 23 18:30:36.685299 kernel: ACPI: bus type drm_connector registered Jan 23 18:30:36.685309 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 23 18:30:36.685318 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 23 18:30:36.685330 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 23 18:30:36.685340 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 23 18:30:36.685351 systemd[1]: Mounted media.mount - External Media Directory. Jan 23 18:30:36.685361 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 23 18:30:36.685372 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 23 18:30:36.685382 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 23 18:30:36.685391 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 23 18:30:36.685426 systemd-journald[1270]: Collecting audit messages is enabled. Jan 23 18:30:36.685445 systemd-journald[1270]: Journal started Jan 23 18:30:36.685463 systemd-journald[1270]: Runtime Journal (/run/log/journal/57e2e020f17145f78e03fb39fdc0acda) is 8M, max 76M, 68M free. Jan 23 18:30:36.431000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Jan 23 18:30:36.594000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.601000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.607000 audit: BPF prog-id=14 op=UNLOAD Jan 23 18:30:36.607000 audit: BPF prog-id=13 op=UNLOAD Jan 23 18:30:36.607000 audit: BPF prog-id=15 op=LOAD Jan 23 18:30:36.607000 audit: BPF prog-id=16 op=LOAD Jan 23 18:30:36.607000 audit: BPF prog-id=17 op=LOAD Jan 23 18:30:36.682000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 23 18:30:36.682000 audit[1270]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=6 a1=7fff0b62ca00 a2=4000 a3=0 items=0 ppid=1 pid=1270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:36.682000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 23 18:30:36.685000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.347216 systemd[1]: Queued start job for default target multi-user.target. Jan 23 18:30:36.359970 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jan 23 18:30:36.360472 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 23 18:30:36.687653 systemd[1]: Started systemd-journald.service - Journal Service. 
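The journald lines above describe the volatile runtime journal under /run/log/journal/<machine-id> (8M used, 76M cap) that is later flushed to persistent storage. A minimal sketch of reading this boot's entries programmatically, assuming the python-systemd bindings are installed; the Reader API comes from that package, not from the log itself:

    from systemd import journal  # python-systemd bindings (assumed installed)

    # Open the journal and restrict it to messages from the current boot,
    # roughly what `journalctl -b` does.
    reader = journal.Reader()
    reader.this_boot()
    reader.log_level(journal.LOG_INFO)

    for entry in reader:
        # Each entry is a dict of journal fields; MESSAGE and _SYSTEMD_UNIT
        # are the fields most visible in the excerpts above.
        print(entry.get("_SYSTEMD_UNIT", "-"), entry["MESSAGE"])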
Jan 23 18:30:36.688000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.691000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.692000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.692000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.691614 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 23 18:30:36.692261 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 23 18:30:36.692440 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 23 18:30:36.693083 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 23 18:30:36.693243 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 23 18:30:36.693000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.693000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.694160 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 23 18:30:36.694308 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 23 18:30:36.694000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.694000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.695285 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 23 18:30:36.695489 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 23 18:30:36.695000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.695000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.696237 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 23 18:30:36.696442 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. 
Jan 23 18:30:36.696000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.696000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.697240 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 23 18:30:36.697449 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 23 18:30:36.697000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.697000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.698201 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 23 18:30:36.698000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.699116 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 23 18:30:36.698000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.700459 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 23 18:30:36.700000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.701365 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 23 18:30:36.701000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.710211 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 23 18:30:36.711366 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 23 18:30:36.711860 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 23 18:30:36.711919 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 23 18:30:36.712995 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 23 18:30:36.713540 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 23 18:30:36.713711 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 23 18:30:36.717730 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... 
Jan 23 18:30:36.719711 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 23 18:30:36.720112 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 23 18:30:36.724863 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 23 18:30:36.726660 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 23 18:30:36.727477 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 23 18:30:36.730706 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 23 18:30:36.733737 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 23 18:30:36.742675 systemd-journald[1270]: Time spent on flushing to /var/log/journal/57e2e020f17145f78e03fb39fdc0acda is 66.638ms for 1370 entries. Jan 23 18:30:36.742675 systemd-journald[1270]: System Journal (/var/log/journal/57e2e020f17145f78e03fb39fdc0acda) is 8M, max 588.1M, 580.1M free. Jan 23 18:30:36.836476 systemd-journald[1270]: Received client request to flush runtime journal. Jan 23 18:30:36.836522 kernel: loop1: detected capacity change from 0 to 111560 Jan 23 18:30:36.836544 kernel: loop2: detected capacity change from 0 to 224512 Jan 23 18:30:36.755000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.793000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.833000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.834000 audit: BPF prog-id=18 op=LOAD Jan 23 18:30:36.834000 audit: BPF prog-id=19 op=LOAD Jan 23 18:30:36.834000 audit: BPF prog-id=20 op=LOAD Jan 23 18:30:36.756082 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 23 18:30:36.756853 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 23 18:30:36.760999 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 23 18:30:36.793538 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 23 18:30:36.833714 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 23 18:30:36.837779 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 23 18:30:36.838000 audit: BPF prog-id=21 op=LOAD Jan 23 18:30:36.841132 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 23 18:30:36.842817 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 23 18:30:36.844175 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 23 18:30:36.845000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:36.845000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.845998 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 23 18:30:36.865000 audit: BPF prog-id=22 op=LOAD Jan 23 18:30:36.866000 audit: BPF prog-id=23 op=LOAD Jan 23 18:30:36.867000 audit: BPF prog-id=24 op=LOAD Jan 23 18:30:36.868229 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 23 18:30:36.869000 audit: BPF prog-id=25 op=LOAD Jan 23 18:30:36.869000 audit: BPF prog-id=26 op=LOAD Jan 23 18:30:36.869000 audit: BPF prog-id=27 op=LOAD Jan 23 18:30:36.871138 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 23 18:30:36.876621 kernel: loop3: detected capacity change from 0 to 50784 Jan 23 18:30:36.911679 kernel: loop4: detected capacity change from 0 to 8 Jan 23 18:30:36.924000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.924254 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 23 18:30:36.924895 systemd-tmpfiles[1327]: ACLs are not supported, ignoring. Jan 23 18:30:36.924905 systemd-tmpfiles[1327]: ACLs are not supported, ignoring. Jan 23 18:30:36.929493 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 23 18:30:36.929000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.944617 kernel: loop5: detected capacity change from 0 to 111560 Jan 23 18:30:36.947706 systemd-nsresourced[1334]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 23 18:30:36.950000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.951184 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 23 18:30:36.957846 kernel: loop6: detected capacity change from 0 to 224512 Jan 23 18:30:36.963103 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 23 18:30:36.963000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:36.974626 kernel: loop7: detected capacity change from 0 to 50784 Jan 23 18:30:36.995625 kernel: loop1: detected capacity change from 0 to 8 Jan 23 18:30:36.998182 (sd-merge)[1341]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-hetzner.raw'. Jan 23 18:30:37.008352 (sd-merge)[1341]: Merged extensions into '/usr'. Jan 23 18:30:37.019651 systemd[1]: Reload requested from client PID 1313 ('systemd-sysext') (unit systemd-sysext.service)... Jan 23 18:30:37.019666 systemd[1]: Reloading... 
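The (sd-merge) entries above show systemd-sysext overlaying the listed .raw system-extension images onto /usr before the reload. A rough sketch of inspecting that state from userspace, assuming the usual sysext search directories and that the systemd-sysext binary is on PATH (both assumptions, not taken from the log):

    import subprocess
    from pathlib import Path

    # Common systemd-sysext image search locations (assumed; distributions can
    # add their own, and Flatcar ships some images elsewhere).
    SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

    for d in SEARCH_DIRS:
        directory = Path(d)
        if not directory.is_dir():
            continue
        for image in sorted(directory.glob("*.raw")):
            print("candidate extension image:", image)

    # `systemd-sysext status` reports which hierarchies currently have
    # extensions merged, matching the "Merged extensions into '/usr'" entry.
    subprocess.run(["systemd-sysext", "status"], check=False)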
Jan 23 18:30:37.059113 systemd-oomd[1325]: No swap; memory pressure usage will be degraded Jan 23 18:30:37.090065 systemd-resolved[1326]: Positive Trust Anchors: Jan 23 18:30:37.090909 systemd-resolved[1326]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 23 18:30:37.091663 systemd-resolved[1326]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 23 18:30:37.091745 systemd-resolved[1326]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 23 18:30:37.100623 zram_generator::config[1385]: No configuration found. Jan 23 18:30:37.112363 systemd-resolved[1326]: Using system hostname 'ci-4547-1-0-c-e2d32aff86'. Jan 23 18:30:37.269046 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 23 18:30:37.269194 systemd[1]: Reloading finished in 249 ms. Jan 23 18:30:37.301303 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 23 18:30:37.301000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.301923 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 23 18:30:37.301000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.302691 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 23 18:30:37.302000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.305948 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 23 18:30:37.305000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.306980 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 23 18:30:37.311744 systemd[1]: Starting ensure-sysext.service... Jan 23 18:30:37.314000 audit: BPF prog-id=8 op=UNLOAD Jan 23 18:30:37.314000 audit: BPF prog-id=7 op=UNLOAD Jan 23 18:30:37.314000 audit: BPF prog-id=28 op=LOAD Jan 23 18:30:37.314000 audit: BPF prog-id=29 op=LOAD Jan 23 18:30:37.314713 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 23 18:30:37.316861 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jan 23 18:30:37.317000 audit: BPF prog-id=30 op=LOAD Jan 23 18:30:37.317000 audit: BPF prog-id=21 op=UNLOAD Jan 23 18:30:37.318000 audit: BPF prog-id=31 op=LOAD Jan 23 18:30:37.318000 audit: BPF prog-id=18 op=UNLOAD Jan 23 18:30:37.319000 audit: BPF prog-id=32 op=LOAD Jan 23 18:30:37.319000 audit: BPF prog-id=33 op=LOAD Jan 23 18:30:37.319000 audit: BPF prog-id=19 op=UNLOAD Jan 23 18:30:37.319000 audit: BPF prog-id=20 op=UNLOAD Jan 23 18:30:37.320000 audit: BPF prog-id=34 op=LOAD Jan 23 18:30:37.325000 audit: BPF prog-id=25 op=UNLOAD Jan 23 18:30:37.325000 audit: BPF prog-id=35 op=LOAD Jan 23 18:30:37.325000 audit: BPF prog-id=36 op=LOAD Jan 23 18:30:37.325000 audit: BPF prog-id=26 op=UNLOAD Jan 23 18:30:37.325000 audit: BPF prog-id=27 op=UNLOAD Jan 23 18:30:37.326000 audit: BPF prog-id=37 op=LOAD Jan 23 18:30:37.326000 audit: BPF prog-id=15 op=UNLOAD Jan 23 18:30:37.326000 audit: BPF prog-id=38 op=LOAD Jan 23 18:30:37.326000 audit: BPF prog-id=39 op=LOAD Jan 23 18:30:37.326000 audit: BPF prog-id=16 op=UNLOAD Jan 23 18:30:37.326000 audit: BPF prog-id=17 op=UNLOAD Jan 23 18:30:37.327000 audit: BPF prog-id=40 op=LOAD Jan 23 18:30:37.327000 audit: BPF prog-id=22 op=UNLOAD Jan 23 18:30:37.328000 audit: BPF prog-id=41 op=LOAD Jan 23 18:30:37.329000 audit: BPF prog-id=42 op=LOAD Jan 23 18:30:37.329000 audit: BPF prog-id=23 op=UNLOAD Jan 23 18:30:37.329000 audit: BPF prog-id=24 op=UNLOAD Jan 23 18:30:37.343399 systemd-tmpfiles[1429]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 23 18:30:37.343433 systemd-tmpfiles[1429]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 23 18:30:37.343697 systemd-tmpfiles[1429]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 23 18:30:37.344694 systemd-tmpfiles[1429]: ACLs are not supported, ignoring. Jan 23 18:30:37.344748 systemd-tmpfiles[1429]: ACLs are not supported, ignoring. Jan 23 18:30:37.345356 systemd[1]: Reload requested from client PID 1428 ('systemctl') (unit ensure-sysext.service)... Jan 23 18:30:37.345413 systemd[1]: Reloading... Jan 23 18:30:37.355700 systemd-tmpfiles[1429]: Detected autofs mount point /boot during canonicalization of boot. Jan 23 18:30:37.357742 systemd-tmpfiles[1429]: Skipping /boot Jan 23 18:30:37.371963 systemd-tmpfiles[1429]: Detected autofs mount point /boot during canonicalization of boot. Jan 23 18:30:37.372038 systemd-tmpfiles[1429]: Skipping /boot Jan 23 18:30:37.390142 systemd-udevd[1430]: Using default interface naming scheme 'v257'. Jan 23 18:30:37.431626 zram_generator::config[1462]: No configuration found. Jan 23 18:30:37.541624 kernel: mousedev: PS/2 mouse device common for all mice Jan 23 18:30:37.577648 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input5 Jan 23 18:30:37.582622 kernel: ACPI: button: Power Button [PWRF] Jan 23 18:30:37.666862 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Jan 23 18:30:37.667157 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 23 18:30:37.673851 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 23 18:30:37.681681 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Jan 23 18:30:37.682215 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 23 18:30:37.682953 systemd[1]: Reloading finished in 337 ms. Jan 23 18:30:37.696115 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. 
Jan 23 18:30:37.697000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.698000 audit: BPF prog-id=43 op=LOAD Jan 23 18:30:37.701000 audit: BPF prog-id=31 op=UNLOAD Jan 23 18:30:37.701000 audit: BPF prog-id=44 op=LOAD Jan 23 18:30:37.701000 audit: BPF prog-id=45 op=LOAD Jan 23 18:30:37.701000 audit: BPF prog-id=32 op=UNLOAD Jan 23 18:30:37.701000 audit: BPF prog-id=33 op=UNLOAD Jan 23 18:30:37.703000 audit: BPF prog-id=46 op=LOAD Jan 23 18:30:37.703000 audit: BPF prog-id=30 op=UNLOAD Jan 23 18:30:37.703000 audit: BPF prog-id=47 op=LOAD Jan 23 18:30:37.703000 audit: BPF prog-id=40 op=UNLOAD Jan 23 18:30:37.703000 audit: BPF prog-id=48 op=LOAD Jan 23 18:30:37.703000 audit: BPF prog-id=49 op=LOAD Jan 23 18:30:37.703000 audit: BPF prog-id=41 op=UNLOAD Jan 23 18:30:37.703000 audit: BPF prog-id=42 op=UNLOAD Jan 23 18:30:37.704000 audit: BPF prog-id=50 op=LOAD Jan 23 18:30:37.704000 audit: BPF prog-id=37 op=UNLOAD Jan 23 18:30:37.704000 audit: BPF prog-id=51 op=LOAD Jan 23 18:30:37.704000 audit: BPF prog-id=52 op=LOAD Jan 23 18:30:37.704000 audit: BPF prog-id=38 op=UNLOAD Jan 23 18:30:37.704000 audit: BPF prog-id=39 op=UNLOAD Jan 23 18:30:37.710000 audit: BPF prog-id=53 op=LOAD Jan 23 18:30:37.712000 audit: BPF prog-id=34 op=UNLOAD Jan 23 18:30:37.712000 audit: BPF prog-id=54 op=LOAD Jan 23 18:30:37.712000 audit: BPF prog-id=55 op=LOAD Jan 23 18:30:37.712000 audit: BPF prog-id=35 op=UNLOAD Jan 23 18:30:37.712000 audit: BPF prog-id=36 op=UNLOAD Jan 23 18:30:37.713000 audit: BPF prog-id=56 op=LOAD Jan 23 18:30:37.718000 audit: BPF prog-id=57 op=LOAD Jan 23 18:30:37.718000 audit: BPF prog-id=28 op=UNLOAD Jan 23 18:30:37.718000 audit: BPF prog-id=29 op=UNLOAD Jan 23 18:30:37.723590 kernel: Console: switching to colour dummy device 80x25 Jan 23 18:30:37.724000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.724833 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 23 18:30:37.739626 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Jan 23 18:30:37.764699 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 23 18:30:37.764765 kernel: [drm] features: -context_init Jan 23 18:30:37.770615 kernel: EDAC MC: Ver: 3.0.0 Jan 23 18:30:37.807648 kernel: [drm] number of scanouts: 1 Jan 23 18:30:37.816635 kernel: [drm] number of cap sets: 0 Jan 23 18:30:37.821634 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jan 23 18:30:37.823274 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Jan 23 18:30:37.828590 systemd[1]: Finished ensure-sysext.service. Jan 23 18:30:37.828837 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Jan 23 18:30:37.828870 kernel: Console: switching to colour frame buffer device 160x50 Jan 23 18:30:37.834620 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 23 18:30:37.836000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:37.841182 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jan 23 18:30:37.853060 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 23 18:30:37.854205 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 23 18:30:37.858719 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 23 18:30:37.858933 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 23 18:30:37.862292 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 23 18:30:37.865651 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 23 18:30:37.868977 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 23 18:30:37.870783 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 23 18:30:37.873965 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 23 18:30:37.875590 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 23 18:30:37.875979 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 23 18:30:37.876079 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 23 18:30:37.882539 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 23 18:30:37.887483 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 23 18:30:37.891264 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 23 18:30:37.892728 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 23 18:30:37.897000 audit: BPF prog-id=58 op=LOAD Jan 23 18:30:37.899056 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 23 18:30:37.900000 audit: BPF prog-id=59 op=LOAD Jan 23 18:30:37.902618 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jan 23 18:30:37.904403 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 23 18:30:37.906349 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 18:30:37.906740 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 23 18:30:37.908308 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 23 18:30:37.908000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.909000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:37.909000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.909000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.909000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.910000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.908807 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 23 18:30:37.909880 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 23 18:30:37.910053 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 23 18:30:37.910363 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 23 18:30:37.920000 audit[1581]: SYSTEM_BOOT pid=1581 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.910535 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 23 18:30:37.917946 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 23 18:30:37.918719 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 23 18:30:37.936739 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 23 18:30:37.938000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.939342 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 23 18:30:37.939000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.939534 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 23 18:30:37.942000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.946033 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 23 18:30:37.946228 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 23 18:30:37.948000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:37.948000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.954000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.953456 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 23 18:30:37.953841 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 23 18:30:37.955000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.956496 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 23 18:30:37.960000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.961118 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 23 18:30:37.971698 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 23 18:30:37.973784 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 23 18:30:37.976000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=clean-ca-certificates comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.976226 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 23 18:30:37.977393 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 23 18:30:37.983317 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 23 18:30:37.996000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:37.993906 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 23 18:30:38.013000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 23 18:30:38.013000 audit[1615]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffd2d371b70 a2=420 a3=0 items=0 ppid=1560 pid=1615 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:38.013000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 23 18:30:38.014611 augenrules[1615]: No rules Jan 23 18:30:38.016364 systemd[1]: audit-rules.service: Deactivated successfully. Jan 23 18:30:38.016665 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
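The audit SYSCALL/PROCTITLE pair above records the auditctl run that loaded /etc/audit/audit.rules (which augenrules found empty). PROCTITLE is the process argv, hex-encoded with NUL separators, so decoding it needs only the standard library:

    # PROCTITLE value copied from the audit record above.
    proctitle = ("2F7362696E2F617564697463746C002D52002F657463"
                 "2F61756469742F61756469742E72756C6573")

    # argv elements are separated by NUL bytes in the hex-encoded field.
    argv = [arg.decode() for arg in bytes.fromhex(proctitle).split(b"\x00")]
    print(argv)  # ['/sbin/auditctl', '-R', '/etc/audit/audit.rules']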
Jan 23 18:30:38.061701 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 18:30:38.065335 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jan 23 18:30:38.068412 systemd[1]: Reached target time-set.target - System Time Set. Jan 23 18:30:38.075809 systemd-networkd[1578]: lo: Link UP Jan 23 18:30:38.076489 systemd-networkd[1578]: lo: Gained carrier Jan 23 18:30:38.081255 systemd-networkd[1578]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 18:30:38.081263 systemd-networkd[1578]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 23 18:30:38.081695 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 23 18:30:38.082867 systemd-networkd[1578]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 18:30:38.082873 systemd-networkd[1578]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 23 18:30:38.083565 systemd-networkd[1578]: eth1: Link UP Jan 23 18:30:38.084245 systemd[1]: Reached target network.target - Network. Jan 23 18:30:38.084570 systemd-networkd[1578]: eth0: Link UP Jan 23 18:30:38.085245 systemd-networkd[1578]: eth1: Gained carrier Jan 23 18:30:38.085566 systemd-networkd[1578]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 18:30:38.087771 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 23 18:30:38.091005 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 23 18:30:38.092586 systemd-networkd[1578]: eth0: Gained carrier Jan 23 18:30:38.093656 systemd-networkd[1578]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 18:30:38.112766 systemd-networkd[1578]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Jan 23 18:30:38.115418 systemd-timesyncd[1580]: Network configuration changed, trying to establish connection. Jan 23 18:30:38.116145 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 23 18:30:38.154659 systemd-networkd[1578]: eth0: DHCPv4 address 46.62.169.9/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jan 23 18:30:38.157655 systemd-timesyncd[1580]: Network configuration changed, trying to establish connection. Jan 23 18:30:38.158477 systemd-timesyncd[1580]: Network configuration changed, trying to establish connection. Jan 23 18:30:38.564624 ldconfig[1574]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 23 18:30:38.570394 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 23 18:30:38.576890 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 23 18:30:38.608320 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 23 18:30:38.612027 systemd[1]: Reached target sysinit.target - System Initialization. Jan 23 18:30:38.613817 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 23 18:30:38.614736 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. 
Jan 23 18:30:38.615550 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jan 23 18:30:38.617084 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 23 18:30:38.620816 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 23 18:30:38.622798 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 23 18:30:38.623872 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 23 18:30:38.626368 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 23 18:30:38.627176 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 23 18:30:38.627233 systemd[1]: Reached target paths.target - Path Units. Jan 23 18:30:38.630283 systemd[1]: Reached target timers.target - Timer Units. Jan 23 18:30:38.633492 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 23 18:30:38.637452 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 23 18:30:38.643226 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 23 18:30:38.646259 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 23 18:30:38.648193 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 23 18:30:38.660844 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 23 18:30:38.662395 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 23 18:30:38.666542 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 23 18:30:38.671171 systemd[1]: Reached target sockets.target - Socket Units. Jan 23 18:30:38.672023 systemd[1]: Reached target basic.target - Basic System. Jan 23 18:30:38.674502 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 23 18:30:38.674565 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 23 18:30:38.676703 systemd[1]: Starting containerd.service - containerd container runtime... Jan 23 18:30:38.681655 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 23 18:30:38.698144 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 23 18:30:38.706989 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 23 18:30:38.714845 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 23 18:30:38.717497 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 23 18:30:38.719593 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 23 18:30:38.725941 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jan 23 18:30:38.733920 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 23 18:30:38.737909 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 23 18:30:38.744175 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Jan 23 18:30:38.752953 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... 
Jan 23 18:30:38.756113 coreos-metadata[1634]: Jan 23 18:30:38.755 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Jan 23 18:30:38.760516 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 23 18:30:38.768956 coreos-metadata[1634]: Jan 23 18:30:38.768 INFO Fetch successful Jan 23 18:30:38.768956 coreos-metadata[1634]: Jan 23 18:30:38.768 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Jan 23 18:30:38.771163 coreos-metadata[1634]: Jan 23 18:30:38.771 INFO Fetch successful Jan 23 18:30:38.775925 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 23 18:30:38.777110 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 23 18:30:38.778914 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 23 18:30:38.783981 systemd[1]: Starting update-engine.service - Update Engine... Jan 23 18:30:38.789005 jq[1639]: false Jan 23 18:30:38.789748 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 23 18:30:38.800012 google_oslogin_nss_cache[1641]: oslogin_cache_refresh[1641]: Refreshing passwd entry cache Jan 23 18:30:38.799383 oslogin_cache_refresh[1641]: Refreshing passwd entry cache Jan 23 18:30:38.804071 google_oslogin_nss_cache[1641]: oslogin_cache_refresh[1641]: Failure getting users, quitting Jan 23 18:30:38.804105 oslogin_cache_refresh[1641]: Failure getting users, quitting Jan 23 18:30:38.804155 google_oslogin_nss_cache[1641]: oslogin_cache_refresh[1641]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 23 18:30:38.804179 oslogin_cache_refresh[1641]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 23 18:30:38.804233 google_oslogin_nss_cache[1641]: oslogin_cache_refresh[1641]: Refreshing group entry cache Jan 23 18:30:38.804252 oslogin_cache_refresh[1641]: Refreshing group entry cache Jan 23 18:30:38.804874 google_oslogin_nss_cache[1641]: oslogin_cache_refresh[1641]: Failure getting groups, quitting Jan 23 18:30:38.804909 oslogin_cache_refresh[1641]: Failure getting groups, quitting Jan 23 18:30:38.804958 google_oslogin_nss_cache[1641]: oslogin_cache_refresh[1641]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 23 18:30:38.804976 oslogin_cache_refresh[1641]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 23 18:30:38.807474 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 23 18:30:38.809100 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 23 18:30:38.809303 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 23 18:30:38.809568 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jan 23 18:30:38.811360 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jan 23 18:30:38.815671 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 23 18:30:38.815877 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 23 18:30:38.832216 systemd[1]: motdgen.service: Deactivated successfully. Jan 23 18:30:38.832696 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Jan 23 18:30:38.845084 jq[1652]: true Jan 23 18:30:38.853656 extend-filesystems[1640]: Found /dev/sda6 Jan 23 18:30:38.869617 extend-filesystems[1640]: Found /dev/sda9 Jan 23 18:30:38.872018 update_engine[1648]: I20260123 18:30:38.869951 1648 main.cc:92] Flatcar Update Engine starting Jan 23 18:30:38.872234 tar[1661]: linux-amd64/LICENSE Jan 23 18:30:38.873657 tar[1661]: linux-amd64/helm Jan 23 18:30:38.875964 extend-filesystems[1640]: Checking size of /dev/sda9 Jan 23 18:30:38.897303 jq[1682]: true Jan 23 18:30:38.891752 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 23 18:30:38.891499 dbus-daemon[1635]: [system] SELinux support is enabled Jan 23 18:30:38.897303 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 23 18:30:38.897324 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 23 18:30:38.898569 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 23 18:30:38.898586 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 23 18:30:38.923788 update_engine[1648]: I20260123 18:30:38.918005 1648 update_check_scheduler.cc:74] Next update check in 9m46s Jan 23 18:30:38.923851 extend-filesystems[1640]: Resized partition /dev/sda9 Jan 23 18:30:38.918498 systemd[1]: Started update-engine.service - Update Engine. Jan 23 18:30:38.922427 systemd-logind[1647]: New seat seat0. Jan 23 18:30:38.923422 systemd-logind[1647]: Watching system buttons on /dev/input/event3 (Power Button) Jan 23 18:30:38.923452 systemd-logind[1647]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 23 18:30:38.931848 extend-filesystems[1699]: resize2fs 1.47.3 (8-Jul-2025) Jan 23 18:30:38.938769 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 23 18:30:38.939294 systemd[1]: Started systemd-logind.service - User Login Management. Jan 23 18:30:38.950826 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 18410491 blocks Jan 23 18:30:38.958814 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 23 18:30:38.959397 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 23 18:30:39.042778 bash[1717]: Updated "/home/core/.ssh/authorized_keys" Jan 23 18:30:39.045031 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 23 18:30:39.049257 systemd[1]: Starting sshkeys.service... Jan 23 18:30:39.094280 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 23 18:30:39.098936 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 23 18:30:39.152668 sshd_keygen[1687]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 23 18:30:39.152857 systemd-networkd[1578]: eth0: Gained IPv6LL Jan 23 18:30:39.153252 systemd-timesyncd[1580]: Network configuration changed, trying to establish connection. Jan 23 18:30:39.160657 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 23 18:30:39.162012 systemd[1]: Reached target network-online.target - Network is Online. 
Jan 23 18:30:39.169189 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 18:30:39.174970 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 23 18:30:39.200055 containerd[1684]: time="2026-01-23T18:30:39Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 23 18:30:39.209923 containerd[1684]: time="2026-01-23T18:30:39.209897550Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 23 18:30:39.227247 coreos-metadata[1725]: Jan 23 18:30:39.227 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Jan 23 18:30:39.228117 coreos-metadata[1725]: Jan 23 18:30:39.227 INFO Fetch successful Jan 23 18:30:39.231486 locksmithd[1698]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 23 18:30:39.232080 containerd[1684]: time="2026-01-23T18:30:39.231903079Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="12.03µs" Jan 23 18:30:39.232080 containerd[1684]: time="2026-01-23T18:30:39.231929549Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 23 18:30:39.232080 containerd[1684]: time="2026-01-23T18:30:39.231961299Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 23 18:30:39.232080 containerd[1684]: time="2026-01-23T18:30:39.231970969Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 23 18:30:39.234340 containerd[1684]: time="2026-01-23T18:30:39.234318580Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 23 18:30:39.234373 containerd[1684]: time="2026-01-23T18:30:39.234345780Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 23 18:30:39.234408 containerd[1684]: time="2026-01-23T18:30:39.234394810Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 23 18:30:39.234408 containerd[1684]: time="2026-01-23T18:30:39.234406490Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 23 18:30:39.234682 containerd[1684]: time="2026-01-23T18:30:39.234650710Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 23 18:30:39.234682 containerd[1684]: time="2026-01-23T18:30:39.234681380Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 23 18:30:39.234729 containerd[1684]: time="2026-01-23T18:30:39.234697380Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 23 18:30:39.234729 containerd[1684]: time="2026-01-23T18:30:39.234703710Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 23 18:30:39.235069 unknown[1725]: wrote ssh authorized keys file for user: core 
Jan 23 18:30:39.235386 containerd[1684]: time="2026-01-23T18:30:39.235122501Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 23 18:30:39.235386 containerd[1684]: time="2026-01-23T18:30:39.235159561Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 23 18:30:39.240067 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 23 18:30:39.244006 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 23 18:30:39.250092 containerd[1684]: time="2026-01-23T18:30:39.250060907Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 23 18:30:39.251940 containerd[1684]: time="2026-01-23T18:30:39.251797898Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 23 18:30:39.251940 containerd[1684]: time="2026-01-23T18:30:39.251830938Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 23 18:30:39.251940 containerd[1684]: time="2026-01-23T18:30:39.251838838Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 23 18:30:39.251940 containerd[1684]: time="2026-01-23T18:30:39.251901148Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 23 18:30:39.252765 containerd[1684]: time="2026-01-23T18:30:39.252748108Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 23 18:30:39.253865 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 23 18:30:39.254244 containerd[1684]: time="2026-01-23T18:30:39.253066848Z" level=info msg="metadata content store policy set" policy=shared Jan 23 18:30:39.269171 systemd[1]: issuegen.service: Deactivated successfully. Jan 23 18:30:39.269734 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 23 18:30:39.272851 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Jan 23 18:30:39.277321 containerd[1684]: time="2026-01-23T18:30:39.276932068Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 23 18:30:39.277321 containerd[1684]: time="2026-01-23T18:30:39.277029168Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 23 18:30:39.277321 containerd[1684]: time="2026-01-23T18:30:39.277149878Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 23 18:30:39.277321 containerd[1684]: time="2026-01-23T18:30:39.277159298Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 23 18:30:39.277321 containerd[1684]: time="2026-01-23T18:30:39.277170548Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 23 18:30:39.277321 containerd[1684]: time="2026-01-23T18:30:39.277179288Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 23 18:30:39.277321 containerd[1684]: time="2026-01-23T18:30:39.277188788Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 23 18:30:39.277321 containerd[1684]: time="2026-01-23T18:30:39.277204398Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 23 18:30:39.277321 containerd[1684]: time="2026-01-23T18:30:39.277227778Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 23 18:30:39.277321 containerd[1684]: time="2026-01-23T18:30:39.277238488Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 23 18:30:39.277321 containerd[1684]: time="2026-01-23T18:30:39.277250858Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 23 18:30:39.277321 containerd[1684]: time="2026-01-23T18:30:39.277258548Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 23 18:30:39.277321 containerd[1684]: time="2026-01-23T18:30:39.277265718Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 23 18:30:39.277321 containerd[1684]: time="2026-01-23T18:30:39.277275668Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 23 18:30:39.297572 kernel: EXT4-fs (sda9): resized filesystem to 18410491 Jan 23 18:30:39.287001 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 23 18:30:39.290104 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 23 18:30:39.291963 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 23 18:30:39.292827 systemd[1]: Reached target getty.target - Login Prompts. 
Jan 23 18:30:39.298917 containerd[1684]: time="2026-01-23T18:30:39.298318917Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 23 18:30:39.298917 containerd[1684]: time="2026-01-23T18:30:39.298717067Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 23 18:30:39.298917 containerd[1684]: time="2026-01-23T18:30:39.298736887Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 23 18:30:39.298917 containerd[1684]: time="2026-01-23T18:30:39.298772217Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 23 18:30:39.298917 containerd[1684]: time="2026-01-23T18:30:39.298780367Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 23 18:30:39.298917 containerd[1684]: time="2026-01-23T18:30:39.298788357Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 23 18:30:39.298917 containerd[1684]: time="2026-01-23T18:30:39.298799097Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 23 18:30:39.298917 containerd[1684]: time="2026-01-23T18:30:39.298808657Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 23 18:30:39.298917 containerd[1684]: time="2026-01-23T18:30:39.298816927Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 23 18:30:39.298917 containerd[1684]: time="2026-01-23T18:30:39.298824727Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 23 18:30:39.298917 containerd[1684]: time="2026-01-23T18:30:39.298833427Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 23 18:30:39.298917 containerd[1684]: time="2026-01-23T18:30:39.298857537Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 23 18:30:39.299380 containerd[1684]: time="2026-01-23T18:30:39.299187477Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 23 18:30:39.299380 containerd[1684]: time="2026-01-23T18:30:39.299203807Z" level=info msg="Start snapshots syncer" Jan 23 18:30:39.299380 containerd[1684]: time="2026-01-23T18:30:39.299225747Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 23 18:30:39.299713 extend-filesystems[1699]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jan 23 18:30:39.299713 extend-filesystems[1699]: old_desc_blocks = 1, new_desc_blocks = 9 Jan 23 18:30:39.299713 extend-filesystems[1699]: The filesystem on /dev/sda9 is now 18410491 (4k) blocks long. Jan 23 18:30:39.305698 extend-filesystems[1640]: Resized filesystem in /dev/sda9 Jan 23 18:30:39.300786 systemd[1]: extend-filesystems.service: Deactivated successfully. 
Jan 23 18:30:39.314410 containerd[1684]: time="2026-01-23T18:30:39.299893708Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 23 18:30:39.314410 containerd[1684]: time="2026-01-23T18:30:39.299929488Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 23 18:30:39.314541 update-ssh-keys[1759]: Updated "/home/core/.ssh/authorized_keys" Jan 23 18:30:39.301684 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Jan 23 18:30:39.315986 containerd[1684]: time="2026-01-23T18:30:39.301760908Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 23 18:30:39.315986 containerd[1684]: time="2026-01-23T18:30:39.302218809Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 23 18:30:39.315986 containerd[1684]: time="2026-01-23T18:30:39.302239689Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 23 18:30:39.315986 containerd[1684]: time="2026-01-23T18:30:39.302248609Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 23 18:30:39.315986 containerd[1684]: time="2026-01-23T18:30:39.302260359Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 23 18:30:39.315986 containerd[1684]: time="2026-01-23T18:30:39.302556669Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 23 18:30:39.315986 containerd[1684]: time="2026-01-23T18:30:39.302569819Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 23 18:30:39.315986 containerd[1684]: time="2026-01-23T18:30:39.302577879Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 23 18:30:39.315986 containerd[1684]: time="2026-01-23T18:30:39.302585489Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 23 18:30:39.315986 containerd[1684]: time="2026-01-23T18:30:39.302593789Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 23 18:30:39.315986 containerd[1684]: time="2026-01-23T18:30:39.305025400Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 23 18:30:39.315986 containerd[1684]: time="2026-01-23T18:30:39.305041550Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 23 18:30:39.315986 containerd[1684]: time="2026-01-23T18:30:39.305048920Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 23 18:30:39.309715 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). 
Jan 23 18:30:39.319124 containerd[1684]: time="2026-01-23T18:30:39.305074000Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 23 18:30:39.319124 containerd[1684]: time="2026-01-23T18:30:39.305081340Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 23 18:30:39.319124 containerd[1684]: time="2026-01-23T18:30:39.306683340Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 23 18:30:39.319124 containerd[1684]: time="2026-01-23T18:30:39.306695630Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 23 18:30:39.319124 containerd[1684]: time="2026-01-23T18:30:39.306706070Z" level=info msg="runtime interface created" Jan 23 18:30:39.319124 containerd[1684]: time="2026-01-23T18:30:39.306710820Z" level=info msg="created NRI interface" Jan 23 18:30:39.319124 containerd[1684]: time="2026-01-23T18:30:39.306716800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 23 18:30:39.319124 containerd[1684]: time="2026-01-23T18:30:39.307326881Z" level=info msg="Connect containerd service" Jan 23 18:30:39.319124 containerd[1684]: time="2026-01-23T18:30:39.307359081Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 23 18:30:39.319124 containerd[1684]: time="2026-01-23T18:30:39.315672484Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 23 18:30:39.315864 systemd[1]: Finished sshkeys.service. Jan 23 18:30:39.343783 systemd-networkd[1578]: eth1: Gained IPv6LL Jan 23 18:30:39.344164 systemd-timesyncd[1580]: Network configuration changed, trying to establish connection. Jan 23 18:30:39.415626 containerd[1684]: time="2026-01-23T18:30:39.415558906Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 23 18:30:39.416385 containerd[1684]: time="2026-01-23T18:30:39.416202106Z" level=info msg="Start subscribing containerd event" Jan 23 18:30:39.416385 containerd[1684]: time="2026-01-23T18:30:39.416228636Z" level=info msg="Start recovering state" Jan 23 18:30:39.416385 containerd[1684]: time="2026-01-23T18:30:39.416302926Z" level=info msg="Start event monitor" Jan 23 18:30:39.416385 containerd[1684]: time="2026-01-23T18:30:39.416312016Z" level=info msg="Start cni network conf syncer for default" Jan 23 18:30:39.416385 containerd[1684]: time="2026-01-23T18:30:39.416317446Z" level=info msg="Start streaming server" Jan 23 18:30:39.416385 containerd[1684]: time="2026-01-23T18:30:39.416324506Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 23 18:30:39.416385 containerd[1684]: time="2026-01-23T18:30:39.416330346Z" level=info msg="runtime interface starting up..." Jan 23 18:30:39.416385 containerd[1684]: time="2026-01-23T18:30:39.416336446Z" level=info msg="starting plugins..." Jan 23 18:30:39.416385 containerd[1684]: time="2026-01-23T18:30:39.416348116Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 23 18:30:39.417195 containerd[1684]: time="2026-01-23T18:30:39.416975976Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Jan 23 18:30:39.417278 systemd[1]: Started containerd.service - containerd container runtime. Jan 23 18:30:39.418842 containerd[1684]: time="2026-01-23T18:30:39.417089866Z" level=info msg="containerd successfully booted in 0.217376s" Jan 23 18:30:39.473654 tar[1661]: linux-amd64/README.md Jan 23 18:30:39.488554 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 23 18:30:40.542381 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 18:30:40.555296 (kubelet)[1792]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 18:30:40.563787 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 23 18:30:40.566246 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 23 18:30:40.571740 systemd[1]: Started sshd@0-46.62.169.9:22-4.153.228.146:41532.service - OpenSSH per-connection server daemon (4.153.228.146:41532). Jan 23 18:30:40.576330 systemd[1]: Startup finished in 3.618s (kernel) + 8.315s (initrd) + 5.045s (userspace) = 16.979s. Jan 23 18:30:41.275033 sshd[1794]: Accepted publickey for core from 4.153.228.146 port 41532 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:30:41.279425 sshd-session[1794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:30:41.294483 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 23 18:30:41.298995 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 23 18:30:41.311091 systemd-logind[1647]: New session 1 of user core. Jan 23 18:30:41.335381 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 23 18:30:41.344967 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 23 18:30:41.367879 (systemd)[1808]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:30:41.372657 systemd-logind[1647]: New session 2 of user core. Jan 23 18:30:41.440411 kubelet[1792]: E0123 18:30:41.440340 1792 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 18:30:41.446031 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 18:30:41.446489 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 18:30:41.448258 systemd[1]: kubelet.service: Consumed 1.407s CPU time, 265.8M memory peak. Jan 23 18:30:41.525059 systemd[1808]: Queued start job for default target default.target. Jan 23 18:30:41.531942 systemd[1808]: Created slice app.slice - User Application Slice. Jan 23 18:30:41.531966 systemd[1808]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 23 18:30:41.531977 systemd[1808]: Reached target paths.target - Paths. Jan 23 18:30:41.532135 systemd[1808]: Reached target timers.target - Timers. Jan 23 18:30:41.533490 systemd[1808]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 23 18:30:41.536727 systemd[1808]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 23 18:30:41.554272 systemd[1808]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. 
Jan 23 18:30:41.559110 systemd[1808]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 23 18:30:41.559216 systemd[1808]: Reached target sockets.target - Sockets. Jan 23 18:30:41.559365 systemd[1808]: Reached target basic.target - Basic System. Jan 23 18:30:41.559413 systemd[1808]: Reached target default.target - Main User Target. Jan 23 18:30:41.559442 systemd[1808]: Startup finished in 177ms. Jan 23 18:30:41.559635 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 23 18:30:41.564723 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 23 18:30:41.955180 systemd[1]: Started sshd@1-46.62.169.9:22-4.153.228.146:41540.service - OpenSSH per-connection server daemon (4.153.228.146:41540). Jan 23 18:30:42.638162 sshd[1823]: Accepted publickey for core from 4.153.228.146 port 41540 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:30:42.641046 sshd-session[1823]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:30:42.651390 systemd-logind[1647]: New session 3 of user core. Jan 23 18:30:42.656869 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 23 18:30:43.013211 sshd[1827]: Connection closed by 4.153.228.146 port 41540 Jan 23 18:30:43.014111 sshd-session[1823]: pam_unix(sshd:session): session closed for user core Jan 23 18:30:43.022899 systemd-logind[1647]: Session 3 logged out. Waiting for processes to exit. Jan 23 18:30:43.024184 systemd[1]: sshd@1-46.62.169.9:22-4.153.228.146:41540.service: Deactivated successfully. Jan 23 18:30:43.027515 systemd[1]: session-3.scope: Deactivated successfully. Jan 23 18:30:43.030262 systemd-logind[1647]: Removed session 3. Jan 23 18:30:43.148982 systemd[1]: Started sshd@2-46.62.169.9:22-4.153.228.146:41546.service - OpenSSH per-connection server daemon (4.153.228.146:41546). Jan 23 18:30:43.834009 sshd[1833]: Accepted publickey for core from 4.153.228.146 port 41546 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:30:43.835982 sshd-session[1833]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:30:43.843676 systemd-logind[1647]: New session 4 of user core. Jan 23 18:30:43.856877 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 23 18:30:44.203866 sshd[1837]: Connection closed by 4.153.228.146 port 41546 Jan 23 18:30:44.205925 sshd-session[1833]: pam_unix(sshd:session): session closed for user core Jan 23 18:30:44.212507 systemd[1]: sshd@2-46.62.169.9:22-4.153.228.146:41546.service: Deactivated successfully. Jan 23 18:30:44.216470 systemd[1]: session-4.scope: Deactivated successfully. Jan 23 18:30:44.220168 systemd-logind[1647]: Session 4 logged out. Waiting for processes to exit. Jan 23 18:30:44.222284 systemd-logind[1647]: Removed session 4. Jan 23 18:30:44.351026 systemd[1]: Started sshd@3-46.62.169.9:22-4.153.228.146:41548.service - OpenSSH per-connection server daemon (4.153.228.146:41548). Jan 23 18:30:45.035704 sshd[1843]: Accepted publickey for core from 4.153.228.146 port 41548 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:30:45.037935 sshd-session[1843]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:30:45.048701 systemd-logind[1647]: New session 5 of user core. Jan 23 18:30:45.054911 systemd[1]: Started session-5.scope - Session 5 of User core. 
Jan 23 18:30:45.410267 sshd[1847]: Connection closed by 4.153.228.146 port 41548 Jan 23 18:30:45.411949 sshd-session[1843]: pam_unix(sshd:session): session closed for user core Jan 23 18:30:45.420111 systemd-logind[1647]: Session 5 logged out. Waiting for processes to exit. Jan 23 18:30:45.421589 systemd[1]: sshd@3-46.62.169.9:22-4.153.228.146:41548.service: Deactivated successfully. Jan 23 18:30:45.425345 systemd[1]: session-5.scope: Deactivated successfully. Jan 23 18:30:45.429365 systemd-logind[1647]: Removed session 5. Jan 23 18:30:45.545550 systemd[1]: Started sshd@4-46.62.169.9:22-4.153.228.146:40800.service - OpenSSH per-connection server daemon (4.153.228.146:40800). Jan 23 18:30:46.229105 sshd[1853]: Accepted publickey for core from 4.153.228.146 port 40800 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:30:46.231939 sshd-session[1853]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:30:46.241685 systemd-logind[1647]: New session 6 of user core. Jan 23 18:30:46.249926 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 23 18:30:46.503250 sudo[1858]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 23 18:30:46.504309 sudo[1858]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 18:30:46.521931 sudo[1858]: pam_unix(sudo:session): session closed for user root Jan 23 18:30:46.644018 sshd[1857]: Connection closed by 4.153.228.146 port 40800 Jan 23 18:30:46.645942 sshd-session[1853]: pam_unix(sshd:session): session closed for user core Jan 23 18:30:46.654556 systemd[1]: sshd@4-46.62.169.9:22-4.153.228.146:40800.service: Deactivated successfully. Jan 23 18:30:46.658323 systemd[1]: session-6.scope: Deactivated successfully. Jan 23 18:30:46.659869 systemd-logind[1647]: Session 6 logged out. Waiting for processes to exit. Jan 23 18:30:46.663008 systemd-logind[1647]: Removed session 6. Jan 23 18:30:46.792147 systemd[1]: Started sshd@5-46.62.169.9:22-4.153.228.146:40804.service - OpenSSH per-connection server daemon (4.153.228.146:40804). Jan 23 18:30:47.486677 sshd[1865]: Accepted publickey for core from 4.153.228.146 port 40804 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:30:47.489310 sshd-session[1865]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:30:47.498691 systemd-logind[1647]: New session 7 of user core. Jan 23 18:30:47.503828 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 23 18:30:47.746014 sudo[1871]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 23 18:30:47.746846 sudo[1871]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 18:30:47.752045 sudo[1871]: pam_unix(sudo:session): session closed for user root Jan 23 18:30:47.765443 sudo[1870]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 23 18:30:47.766222 sudo[1870]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 18:30:47.781846 systemd[1]: Starting audit-rules.service - Load Audit Rules... 
Jan 23 18:30:47.855693 kernel: kauditd_printk_skb: 189 callbacks suppressed Jan 23 18:30:47.855855 kernel: audit: type=1305 audit(1769193047.847:236): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 23 18:30:47.847000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 23 18:30:47.852926 systemd[1]: audit-rules.service: Deactivated successfully. Jan 23 18:30:47.856187 augenrules[1895]: No rules Jan 23 18:30:47.853452 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 23 18:30:47.847000 audit[1895]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fffa70633a0 a2=420 a3=0 items=0 ppid=1876 pid=1895 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:47.857161 sudo[1870]: pam_unix(sudo:session): session closed for user root Jan 23 18:30:47.873708 kernel: audit: type=1300 audit(1769193047.847:236): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fffa70633a0 a2=420 a3=0 items=0 ppid=1876 pid=1895 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:47.873831 kernel: audit: type=1327 audit(1769193047.847:236): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 23 18:30:47.847000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 23 18:30:47.855000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:47.855000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:47.890272 kernel: audit: type=1130 audit(1769193047.855:237): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:47.890373 kernel: audit: type=1131 audit(1769193047.855:238): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:47.890415 kernel: audit: type=1106 audit(1769193047.856:239): pid=1870 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:30:47.856000 audit[1870]: USER_END pid=1870 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:30:47.856000 audit[1870]: CRED_DISP pid=1870 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:47.908654 kernel: audit: type=1104 audit(1769193047.856:240): pid=1870 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:30:47.982973 sshd[1869]: Connection closed by 4.153.228.146 port 40804 Jan 23 18:30:47.984895 sshd-session[1865]: pam_unix(sshd:session): session closed for user core Jan 23 18:30:47.986000 audit[1865]: USER_END pid=1865 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:30:48.000730 kernel: audit: type=1106 audit(1769193047.986:241): pid=1865 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:30:47.986000 audit[1865]: CRED_DISP pid=1865 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:30:48.010715 kernel: audit: type=1104 audit(1769193047.986:242): pid=1865 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:30:48.006495 systemd[1]: sshd@5-46.62.169.9:22-4.153.228.146:40804.service: Deactivated successfully. Jan 23 18:30:48.011022 systemd[1]: session-7.scope: Deactivated successfully. Jan 23 18:30:48.006000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-46.62.169.9:22-4.153.228.146:40804 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:48.014121 systemd-logind[1647]: Session 7 logged out. Waiting for processes to exit. Jan 23 18:30:48.017106 systemd-logind[1647]: Removed session 7. Jan 23 18:30:48.020879 kernel: audit: type=1131 audit(1769193048.006:243): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-46.62.169.9:22-4.153.228.146:40804 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:48.125697 systemd[1]: Started sshd@6-46.62.169.9:22-4.153.228.146:40820.service - OpenSSH per-connection server daemon (4.153.228.146:40820). Jan 23 18:30:48.125000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-46.62.169.9:22-4.153.228.146:40820 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:30:48.805000 audit[1904]: USER_ACCT pid=1904 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:30:48.806783 sshd[1904]: Accepted publickey for core from 4.153.228.146 port 40820 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:30:48.807000 audit[1904]: CRED_ACQ pid=1904 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:30:48.807000 audit[1904]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffedd1f2ef0 a2=3 a3=0 items=0 ppid=1 pid=1904 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:48.807000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:30:48.809454 sshd-session[1904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:30:48.819681 systemd-logind[1647]: New session 8 of user core. Jan 23 18:30:48.826913 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 23 18:30:48.832000 audit[1904]: USER_START pid=1904 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:30:48.835000 audit[1908]: CRED_ACQ pid=1908 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:30:49.065000 audit[1909]: USER_ACCT pid=1909 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:30:49.066174 sudo[1909]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 23 18:30:49.065000 audit[1909]: CRED_REFR pid=1909 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:30:49.067135 sudo[1909]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 18:30:49.066000 audit[1909]: USER_START pid=1909 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:30:49.684444 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Jan 23 18:30:49.704206 (dockerd)[1928]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 23 18:30:50.175901 dockerd[1928]: time="2026-01-23T18:30:50.175702777Z" level=info msg="Starting up" Jan 23 18:30:50.177238 dockerd[1928]: time="2026-01-23T18:30:50.177191618Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 23 18:30:50.200046 dockerd[1928]: time="2026-01-23T18:30:50.199974937Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 23 18:30:50.226379 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport741313354-merged.mount: Deactivated successfully. Jan 23 18:30:50.274091 dockerd[1928]: time="2026-01-23T18:30:50.273990678Z" level=info msg="Loading containers: start." Jan 23 18:30:50.287686 kernel: Initializing XFRM netlink socket Jan 23 18:30:50.402000 audit[1977]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1977 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.402000 audit[1977]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffdc5721780 a2=0 a3=0 items=0 ppid=1928 pid=1977 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.402000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 23 18:30:50.406000 audit[1979]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1979 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.406000 audit[1979]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7fff86cb77e0 a2=0 a3=0 items=0 ppid=1928 pid=1979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.406000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 23 18:30:50.411000 audit[1981]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1981 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.411000 audit[1981]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc1dc805c0 a2=0 a3=0 items=0 ppid=1928 pid=1981 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.411000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 23 18:30:50.416000 audit[1983]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1983 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.416000 audit[1983]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd7155b9a0 a2=0 a3=0 items=0 ppid=1928 pid=1983 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.416000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 23 18:30:50.421000 audit[1985]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=1985 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.421000 audit[1985]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe63a50230 a2=0 a3=0 items=0 ppid=1928 pid=1985 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.421000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 23 18:30:50.426000 audit[1987]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=1987 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.426000 audit[1987]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffd18ac4100 a2=0 a3=0 items=0 ppid=1928 pid=1987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.426000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 18:30:50.431000 audit[1989]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=1989 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.431000 audit[1989]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffdc4e43a80 a2=0 a3=0 items=0 ppid=1928 pid=1989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.431000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 23 18:30:50.437000 audit[1991]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=1991 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.437000 audit[1991]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffdd6ba9ef0 a2=0 a3=0 items=0 ppid=1928 pid=1991 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.437000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 23 18:30:50.489000 audit[1994]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=1994 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.489000 audit[1994]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7ffd9cf2b4c0 a2=0 a3=0 items=0 ppid=1928 pid=1994 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.489000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 23 18:30:50.494000 audit[1996]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=1996 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.494000 audit[1996]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffe2b579c40 a2=0 a3=0 items=0 ppid=1928 pid=1996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.494000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 23 18:30:50.499000 audit[1998]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=1998 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.499000 audit[1998]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffe277cdec0 a2=0 a3=0 items=0 ppid=1928 pid=1998 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.499000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 23 18:30:50.504000 audit[2000]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2000 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.504000 audit[2000]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffcd0aefa10 a2=0 a3=0 items=0 ppid=1928 pid=2000 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.504000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 18:30:50.509000 audit[2002]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2002 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.509000 audit[2002]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffc4912a8d0 a2=0 a3=0 items=0 ppid=1928 pid=2002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.509000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 23 18:30:50.596000 audit[2032]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2032 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:50.596000 audit[2032]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7fff23d81570 a2=0 a3=0 items=0 ppid=1928 pid=2032 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.596000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 23 18:30:50.601000 audit[2034]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2034 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:50.601000 audit[2034]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffe34e3f050 a2=0 a3=0 items=0 ppid=1928 pid=2034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.601000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 23 18:30:50.606000 audit[2036]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2036 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:50.606000 audit[2036]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff7609d820 a2=0 a3=0 items=0 ppid=1928 pid=2036 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.606000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 23 18:30:50.611000 audit[2038]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2038 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:50.611000 audit[2038]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe837e8010 a2=0 a3=0 items=0 ppid=1928 pid=2038 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.611000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 23 18:30:50.616000 audit[2040]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2040 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:50.616000 audit[2040]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe28888830 a2=0 a3=0 items=0 ppid=1928 pid=2040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.616000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 23 18:30:50.620000 audit[2042]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2042 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:50.620000 audit[2042]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fffc5323500 a2=0 a3=0 items=0 ppid=1928 pid=2042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.620000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 18:30:50.625000 audit[2044]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2044 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:50.625000 audit[2044]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffd925065a0 a2=0 a3=0 items=0 ppid=1928 pid=2044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.625000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 23 18:30:50.631000 audit[2046]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2046 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:50.631000 audit[2046]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffdc276a790 a2=0 a3=0 items=0 ppid=1928 pid=2046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.631000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 23 18:30:50.636000 audit[2048]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2048 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:50.636000 audit[2048]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7ffe234dece0 a2=0 a3=0 items=0 ppid=1928 pid=2048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.636000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 23 18:30:50.641000 audit[2050]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2050 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:50.641000 audit[2050]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffd1c7acc40 a2=0 a3=0 items=0 ppid=1928 pid=2050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.641000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 23 18:30:50.647000 audit[2052]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2052 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:50.647000 audit[2052]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffc75a833e0 a2=0 a3=0 items=0 ppid=1928 pid=2052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.647000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 23 18:30:50.652000 audit[2054]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2054 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables" Jan 23 18:30:50.652000 audit[2054]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffe26fec2e0 a2=0 a3=0 items=0 ppid=1928 pid=2054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.652000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 18:30:50.657000 audit[2056]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2056 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:50.657000 audit[2056]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffc64d49ce0 a2=0 a3=0 items=0 ppid=1928 pid=2056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.657000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 23 18:30:50.669000 audit[2061]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2061 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.669000 audit[2061]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff29740270 a2=0 a3=0 items=0 ppid=1928 pid=2061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.669000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 23 18:30:50.674000 audit[2063]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2063 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.674000 audit[2063]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffd82a76150 a2=0 a3=0 items=0 ppid=1928 pid=2063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.674000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 23 18:30:50.680000 audit[2065]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2065 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.680000 audit[2065]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffc74515a40 a2=0 a3=0 items=0 ppid=1928 pid=2065 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.680000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 23 18:30:50.685000 audit[2067]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2067 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:50.685000 audit[2067]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fffda602ad0 a2=0 a3=0 items=0 ppid=1928 pid=2067 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.685000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 23 18:30:50.690000 audit[2069]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2069 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:50.690000 audit[2069]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffec1c7cf30 a2=0 a3=0 items=0 ppid=1928 pid=2069 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.690000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 23 18:30:50.695000 audit[2071]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2071 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:30:50.695000 audit[2071]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7fff215c6980 a2=0 a3=0 items=0 ppid=1928 pid=2071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.695000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 23 18:30:50.709178 systemd-timesyncd[1580]: Network configuration changed, trying to establish connection. Jan 23 18:30:50.323725 systemd-timesyncd[1580]: Contacted time server 141.144.241.16:123 (2.flatcar.pool.ntp.org). Jan 23 18:30:50.338140 systemd-journald[1270]: Time jumped backwards, rotating. Jan 23 18:30:50.323832 systemd-timesyncd[1580]: Initial clock synchronization to Fri 2026-01-23 18:30:50.323386 UTC. Jan 23 18:30:50.324649 systemd-resolved[1326]: Clock change detected. Flushing caches. 
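The audit records above show dockerd programming its firewall: each PROCTITLE field is the hex-encoded, NUL-separated argv of the iptables/ip6tables call that created a chain or rule (DOCKER, DOCKER-FORWARD, DOCKER-BRIDGE, DOCKER-CT, DOCKER-ISOLATION-STAGE-1/2, DOCKER-USER). The original command lines can be recovered by hex-decoding and splitting on NUL bytes; a minimal Python sketch (the helper name is mine, the sample value is the first PROCTITLE record in this section):

    def decode_proctitle(hex_value: str) -> str:
        # The kernel hex-encodes argv with NUL bytes between the arguments.
        raw = bytes.fromhex(hex_value)
        return " ".join(arg.decode() for arg in raw.split(b"\x00") if arg)

    # First PROCTITLE emitted while dockerd creates its nat chain.
    sample = ("2F7573722F62696E2F69707461626C6573"
              "002D2D77616974002D74006E6174002D4E00444F434B4552")
    print(decode_proctitle(sample))  # -> /usr/bin/iptables --wait -t nat -N DOCKER
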
Jan 23 18:30:50.339000 audit[2076]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2076 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.339000 audit[2076]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffc3b937260 a2=0 a3=0 items=0 ppid=1928 pid=2076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.339000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 23 18:30:50.358000 audit[2079]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2079 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.358000 audit[2079]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffeec7c5f80 a2=0 a3=0 items=0 ppid=1928 pid=2079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.358000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 23 18:30:50.373000 audit[2087]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2087 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.373000 audit[2087]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7ffea17f03f0 a2=0 a3=0 items=0 ppid=1928 pid=2087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.373000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 23 18:30:50.387000 audit[2093]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2093 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.387000 audit[2093]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffc187b70b0 a2=0 a3=0 items=0 ppid=1928 pid=2093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.387000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 23 18:30:50.391000 audit[2095]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2095 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.391000 audit[2095]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7ffdabda90b0 a2=0 a3=0 items=0 ppid=1928 pid=2095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.391000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 23 18:30:50.395000 audit[2097]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2097 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.395000 audit[2097]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7fffe0732cd0 a2=0 a3=0 items=0 ppid=1928 pid=2097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.395000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 23 18:30:50.399000 audit[2099]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2099 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.399000 audit[2099]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffd341afa10 a2=0 a3=0 items=0 ppid=1928 pid=2099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.399000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 23 18:30:50.402000 audit[2101]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2101 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:30:50.402000 audit[2101]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffc25e4fb40 a2=0 a3=0 items=0 ppid=1928 pid=2101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:30:50.402000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 23 18:30:50.403879 systemd-networkd[1578]: docker0: Link UP Jan 23 18:30:50.409756 dockerd[1928]: time="2026-01-23T18:30:50.409699742Z" level=info msg="Loading containers: done." Jan 23 18:30:50.432585 dockerd[1928]: time="2026-01-23T18:30:50.432534092Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 23 18:30:50.432807 dockerd[1928]: time="2026-01-23T18:30:50.432626122Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 23 18:30:50.432807 dockerd[1928]: time="2026-01-23T18:30:50.432751652Z" level=info msg="Initializing buildkit" Jan 23 18:30:50.475168 dockerd[1928]: time="2026-01-23T18:30:50.475102699Z" level=info msg="Completed buildkit initialization" Jan 23 18:30:50.485000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 23 18:30:50.487445 dockerd[1928]: time="2026-01-23T18:30:50.486387004Z" level=info msg="Daemon has completed initialization" Jan 23 18:30:50.486713 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 23 18:30:50.488187 dockerd[1928]: time="2026-01-23T18:30:50.488094435Z" level=info msg="API listen on /run/docker.sock" Jan 23 18:30:50.802210 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2370937259-merged.mount: Deactivated successfully. Jan 23 18:30:51.278426 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 23 18:30:51.282267 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 18:30:51.507804 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 18:30:51.507000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:30:51.522536 (kubelet)[2149]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 18:30:51.598490 kubelet[2149]: E0123 18:30:51.598274 2149 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 18:30:51.608793 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 18:30:51.610405 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 18:30:51.610000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 18:30:51.611658 systemd[1]: kubelet.service: Consumed 271ms CPU time, 110.9M memory peak. Jan 23 18:30:51.965679 containerd[1684]: time="2026-01-23T18:30:51.965228510Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\"" Jan 23 18:30:52.618422 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2521767802.mount: Deactivated successfully. 
Jan 23 18:30:53.561186 containerd[1684]: time="2026-01-23T18:30:53.561126295Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:53.562554 containerd[1684]: time="2026-01-23T18:30:53.562359215Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.11: active requests=0, bytes read=27401903" Jan 23 18:30:53.563521 containerd[1684]: time="2026-01-23T18:30:53.563496306Z" level=info msg="ImageCreate event name:\"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:53.566003 containerd[1684]: time="2026-01-23T18:30:53.565957637Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:53.566717 containerd[1684]: time="2026-01-23T18:30:53.566691317Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.11\" with image id \"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\", size \"29067246\" in 1.601409587s" Jan 23 18:30:53.566769 containerd[1684]: time="2026-01-23T18:30:53.566722797Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\" returns image reference \"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\"" Jan 23 18:30:53.567606 containerd[1684]: time="2026-01-23T18:30:53.567238137Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\"" Jan 23 18:30:54.679751 containerd[1684]: time="2026-01-23T18:30:54.679696530Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:54.680750 containerd[1684]: time="2026-01-23T18:30:54.680632641Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.11: active requests=0, bytes read=24985199" Jan 23 18:30:54.681642 containerd[1684]: time="2026-01-23T18:30:54.681617501Z" level=info msg="ImageCreate event name:\"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:54.683580 containerd[1684]: time="2026-01-23T18:30:54.683555662Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:54.684178 containerd[1684]: time="2026-01-23T18:30:54.684160352Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.11\" with image id \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\", size \"26650388\" in 1.116902425s" Jan 23 18:30:54.684228 containerd[1684]: time="2026-01-23T18:30:54.684218842Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\" returns image reference \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\"" Jan 23 
18:30:54.684809 containerd[1684]: time="2026-01-23T18:30:54.684787823Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\"" Jan 23 18:30:55.968713 containerd[1684]: time="2026-01-23T18:30:55.968659807Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:55.969566 containerd[1684]: time="2026-01-23T18:30:55.969519688Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.11: active requests=0, bytes read=19396939" Jan 23 18:30:55.970713 containerd[1684]: time="2026-01-23T18:30:55.970629288Z" level=info msg="ImageCreate event name:\"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:55.972635 containerd[1684]: time="2026-01-23T18:30:55.972573039Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:55.973352 containerd[1684]: time="2026-01-23T18:30:55.973175659Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.11\" with image id \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\", size \"21062128\" in 1.288365176s" Jan 23 18:30:55.973352 containerd[1684]: time="2026-01-23T18:30:55.973248459Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\" returns image reference \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\"" Jan 23 18:30:55.973655 containerd[1684]: time="2026-01-23T18:30:55.973608659Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\"" Jan 23 18:30:57.116407 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1593708568.mount: Deactivated successfully. 
Jan 23 18:30:57.397265 containerd[1684]: time="2026-01-23T18:30:57.397163532Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:57.398297 containerd[1684]: time="2026-01-23T18:30:57.398262613Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.11: active requests=0, bytes read=31158177" Jan 23 18:30:57.399170 containerd[1684]: time="2026-01-23T18:30:57.399135073Z" level=info msg="ImageCreate event name:\"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:57.400653 containerd[1684]: time="2026-01-23T18:30:57.400602694Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:57.400987 containerd[1684]: time="2026-01-23T18:30:57.400918144Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.11\" with image id \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\", repo tag \"registry.k8s.io/kube-proxy:v1.32.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\", size \"31160918\" in 1.427272825s" Jan 23 18:30:57.400987 containerd[1684]: time="2026-01-23T18:30:57.400967744Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\" returns image reference \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\"" Jan 23 18:30:57.401667 containerd[1684]: time="2026-01-23T18:30:57.401632974Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jan 23 18:30:57.909293 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3307054309.mount: Deactivated successfully. 
Jan 23 18:30:58.829060 containerd[1684]: time="2026-01-23T18:30:58.829017039Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:58.830343 containerd[1684]: time="2026-01-23T18:30:58.830282379Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=17569900" Jan 23 18:30:58.831157 containerd[1684]: time="2026-01-23T18:30:58.831122579Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:58.834328 containerd[1684]: time="2026-01-23T18:30:58.833576560Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:30:58.834328 containerd[1684]: time="2026-01-23T18:30:58.834214751Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.432559747s" Jan 23 18:30:58.834328 containerd[1684]: time="2026-01-23T18:30:58.834235641Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Jan 23 18:30:58.834919 containerd[1684]: time="2026-01-23T18:30:58.834905691Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 23 18:30:59.297819 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2120671552.mount: Deactivated successfully. 
Jan 23 18:30:59.304845 containerd[1684]: time="2026-01-23T18:30:59.304733177Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 18:30:59.305939 containerd[1684]: time="2026-01-23T18:30:59.305867137Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 23 18:30:59.307371 containerd[1684]: time="2026-01-23T18:30:59.307293008Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 18:30:59.310407 containerd[1684]: time="2026-01-23T18:30:59.310329969Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 18:30:59.311714 containerd[1684]: time="2026-01-23T18:30:59.311282759Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 476.270258ms" Jan 23 18:30:59.311714 containerd[1684]: time="2026-01-23T18:30:59.311333009Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 23 18:30:59.312142 containerd[1684]: time="2026-01-23T18:30:59.312100510Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jan 23 18:30:59.833360 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2304617886.mount: Deactivated successfully. 
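Each completed pull above logs the image size and the elapsed wall time in the same "Pulled image" entry, so an approximate transfer rate can be read straight out of the log (for example the kube-apiserver image, 29067246 bytes in about 1.6 s, works out to roughly 18 MB/s). A small sketch that extracts the two numbers; the regex and the use of the reported image size as a stand-in for bytes actually transferred are assumptions on my part, not anything containerd itself reports:

    import re

    def pull_rate_mb_per_s(line: str) -> float:
        # Parse 'size "N"' (written as size \"N\" in the raw log) and 'in Xs'.
        size_bytes = int(re.search(r'size \\?"(\d+)\\?"', line).group(1))
        seconds = float(re.search(r'in ([0-9.]+)s', line).group(1))
        return size_bytes / seconds / 1e6

    example = 'Pulled image "registry.k8s.io/kube-apiserver:v1.32.11" ... size "29067246" in 1.601409587s'
    print(round(pull_rate_mb_per_s(example), 1))  # about 18.2
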
Jan 23 18:31:01.441883 containerd[1684]: time="2026-01-23T18:31:01.441840867Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:01.443007 containerd[1684]: time="2026-01-23T18:31:01.442890407Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=45502580" Jan 23 18:31:01.443998 containerd[1684]: time="2026-01-23T18:31:01.443949478Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:01.445997 containerd[1684]: time="2026-01-23T18:31:01.445853648Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:01.446523 containerd[1684]: time="2026-01-23T18:31:01.446405889Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 2.134265409s" Jan 23 18:31:01.446523 containerd[1684]: time="2026-01-23T18:31:01.446425549Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Jan 23 18:31:01.859811 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 23 18:31:01.862715 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 18:31:02.066435 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 18:31:02.072046 kernel: kauditd_printk_skb: 134 callbacks suppressed Jan 23 18:31:02.072104 kernel: audit: type=1130 audit(1769193062.066:296): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:02.066000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:02.087218 (kubelet)[2357]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 18:31:02.135076 kubelet[2357]: E0123 18:31:02.134527 2357 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 18:31:02.139934 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 18:31:02.140383 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 18:31:02.140000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 18:31:02.141134 systemd[1]: kubelet.service: Consumed 207ms CPU time, 108.3M memory peak. 
Jan 23 18:31:02.152301 kernel: audit: type=1131 audit(1769193062.140:297): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 18:31:04.218000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:04.219693 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 18:31:04.219968 systemd[1]: kubelet.service: Consumed 207ms CPU time, 108.3M memory peak. Jan 23 18:31:04.231017 kernel: audit: type=1130 audit(1769193064.218:298): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:04.218000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:04.235352 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 18:31:04.238988 kernel: audit: type=1131 audit(1769193064.218:299): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:04.268036 systemd[1]: Reload requested from client PID 2384 ('systemctl') (unit session-8.scope)... Jan 23 18:31:04.268048 systemd[1]: Reloading... Jan 23 18:31:04.375050 zram_generator::config[2430]: No configuration found. Jan 23 18:31:04.554505 systemd[1]: Reloading finished in 286 ms. Jan 23 18:31:04.575121 kernel: audit: type=1334 audit(1769193064.568:300): prog-id=63 op=LOAD Jan 23 18:31:04.568000 audit: BPF prog-id=63 op=LOAD Jan 23 18:31:04.583037 kernel: audit: type=1334 audit(1769193064.568:301): prog-id=46 op=UNLOAD Jan 23 18:31:04.568000 audit: BPF prog-id=46 op=UNLOAD Jan 23 18:31:04.571000 audit: BPF prog-id=64 op=LOAD Jan 23 18:31:04.592070 kernel: audit: type=1334 audit(1769193064.571:302): prog-id=64 op=LOAD Jan 23 18:31:04.571000 audit: BPF prog-id=65 op=LOAD Jan 23 18:31:04.597442 kernel: audit: type=1334 audit(1769193064.571:303): prog-id=65 op=LOAD Jan 23 18:31:04.571000 audit: BPF prog-id=56 op=UNLOAD Jan 23 18:31:04.597674 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 23 18:31:04.597753 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 23 18:31:04.598043 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 18:31:04.598079 systemd[1]: kubelet.service: Consumed 126ms CPU time, 98.4M memory peak. Jan 23 18:31:04.601149 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 23 18:31:04.604994 kernel: audit: type=1334 audit(1769193064.571:304): prog-id=56 op=UNLOAD Jan 23 18:31:04.605038 kernel: audit: type=1334 audit(1769193064.571:305): prog-id=57 op=UNLOAD Jan 23 18:31:04.571000 audit: BPF prog-id=57 op=UNLOAD Jan 23 18:31:04.572000 audit: BPF prog-id=66 op=LOAD Jan 23 18:31:04.572000 audit: BPF prog-id=50 op=UNLOAD Jan 23 18:31:04.572000 audit: BPF prog-id=67 op=LOAD Jan 23 18:31:04.572000 audit: BPF prog-id=68 op=LOAD Jan 23 18:31:04.572000 audit: BPF prog-id=51 op=UNLOAD Jan 23 18:31:04.572000 audit: BPF prog-id=52 op=UNLOAD Jan 23 18:31:04.573000 audit: BPF prog-id=69 op=LOAD Jan 23 18:31:04.573000 audit: BPF prog-id=53 op=UNLOAD Jan 23 18:31:04.573000 audit: BPF prog-id=70 op=LOAD Jan 23 18:31:04.573000 audit: BPF prog-id=71 op=LOAD Jan 23 18:31:04.573000 audit: BPF prog-id=54 op=UNLOAD Jan 23 18:31:04.573000 audit: BPF prog-id=55 op=UNLOAD Jan 23 18:31:04.574000 audit: BPF prog-id=72 op=LOAD Jan 23 18:31:04.574000 audit: BPF prog-id=43 op=UNLOAD Jan 23 18:31:04.574000 audit: BPF prog-id=73 op=LOAD Jan 23 18:31:04.574000 audit: BPF prog-id=74 op=LOAD Jan 23 18:31:04.574000 audit: BPF prog-id=44 op=UNLOAD Jan 23 18:31:04.574000 audit: BPF prog-id=45 op=UNLOAD Jan 23 18:31:04.576000 audit: BPF prog-id=75 op=LOAD Jan 23 18:31:04.576000 audit: BPF prog-id=59 op=UNLOAD Jan 23 18:31:04.577000 audit: BPF prog-id=76 op=LOAD Jan 23 18:31:04.577000 audit: BPF prog-id=60 op=UNLOAD Jan 23 18:31:04.577000 audit: BPF prog-id=77 op=LOAD Jan 23 18:31:04.577000 audit: BPF prog-id=78 op=LOAD Jan 23 18:31:04.577000 audit: BPF prog-id=61 op=UNLOAD Jan 23 18:31:04.578000 audit: BPF prog-id=62 op=UNLOAD Jan 23 18:31:04.578000 audit: BPF prog-id=79 op=LOAD Jan 23 18:31:04.578000 audit: BPF prog-id=58 op=UNLOAD Jan 23 18:31:04.579000 audit: BPF prog-id=80 op=LOAD Jan 23 18:31:04.579000 audit: BPF prog-id=47 op=UNLOAD Jan 23 18:31:04.579000 audit: BPF prog-id=81 op=LOAD Jan 23 18:31:04.579000 audit: BPF prog-id=82 op=LOAD Jan 23 18:31:04.579000 audit: BPF prog-id=48 op=UNLOAD Jan 23 18:31:04.579000 audit: BPF prog-id=49 op=UNLOAD Jan 23 18:31:04.597000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 18:31:04.783629 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 18:31:04.782000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:04.797592 (kubelet)[2483]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 23 18:31:04.848260 kubelet[2483]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 23 18:31:04.848260 kubelet[2483]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 23 18:31:04.848260 kubelet[2483]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
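The two earlier kubelet start attempts failed only because /var/lib/kubelet/config.yaml did not exist yet, and the deprecation warnings just above point the remaining command-line flags at that same file. As a rough illustration of what such a KubeletConfiguration might contain (cgroupDriver=systemd and the /etc/kubernetes/manifests static pod path appear later in this log; the runtime endpoint value is an assumption, and on this node the real file is presumably written by the cluster bootstrap rather than by hand), sketched in Python for consistency with the other snippets:

    from pathlib import Path

    # Minimal kubelet.config.k8s.io/v1beta1 KubeletConfiguration; values are illustrative.
    KUBELET_CONFIG = "\n".join([
        "apiVersion: kubelet.config.k8s.io/v1beta1",
        "kind: KubeletConfiguration",
        "cgroupDriver: systemd",                    # matches cgroupDriver="systemd" seen below
        "staticPodPath: /etc/kubernetes/manifests",
        "containerRuntimeEndpoint: unix:///run/containerd/containerd.sock",  # assumed value
    ]) + "\n"

    def write_config(path: str = "/var/lib/kubelet/config.yaml") -> None:
        # The two failed start attempts above were simply this file being absent.
        Path(path).parent.mkdir(parents=True, exist_ok=True)
        Path(path).write_text(KUBELET_CONFIG)
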
Jan 23 18:31:04.849422 kubelet[2483]: I0123 18:31:04.849357 2483 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 23 18:31:05.122161 kubelet[2483]: I0123 18:31:05.122065 2483 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 23 18:31:05.122161 kubelet[2483]: I0123 18:31:05.122103 2483 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 23 18:31:05.122452 kubelet[2483]: I0123 18:31:05.122429 2483 server.go:954] "Client rotation is on, will bootstrap in background" Jan 23 18:31:05.146703 kubelet[2483]: I0123 18:31:05.146313 2483 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 23 18:31:05.147809 kubelet[2483]: E0123 18:31:05.147771 2483 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://46.62.169.9:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 46.62.169.9:6443: connect: connection refused" logger="UnhandledError" Jan 23 18:31:05.156175 kubelet[2483]: I0123 18:31:05.155411 2483 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 23 18:31:05.160012 kubelet[2483]: I0123 18:31:05.159317 2483 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 23 18:31:05.160743 kubelet[2483]: I0123 18:31:05.160695 2483 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 23 18:31:05.160928 kubelet[2483]: I0123 18:31:05.160737 2483 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-1-0-c-e2d32aff86","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 23 18:31:05.160928 kubelet[2483]: I0123 18:31:05.160929 2483 topology_manager.go:138] "Creating topology manager with none policy" Jan 23 
18:31:05.161161 kubelet[2483]: I0123 18:31:05.160936 2483 container_manager_linux.go:304] "Creating device plugin manager" Jan 23 18:31:05.161161 kubelet[2483]: I0123 18:31:05.161059 2483 state_mem.go:36] "Initialized new in-memory state store" Jan 23 18:31:05.164453 kubelet[2483]: I0123 18:31:05.164418 2483 kubelet.go:446] "Attempting to sync node with API server" Jan 23 18:31:05.164531 kubelet[2483]: I0123 18:31:05.164461 2483 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 23 18:31:05.164531 kubelet[2483]: I0123 18:31:05.164478 2483 kubelet.go:352] "Adding apiserver pod source" Jan 23 18:31:05.164531 kubelet[2483]: I0123 18:31:05.164486 2483 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 23 18:31:05.168896 kubelet[2483]: I0123 18:31:05.168862 2483 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 23 18:31:05.169224 kubelet[2483]: I0123 18:31:05.169197 2483 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 23 18:31:05.169815 kubelet[2483]: W0123 18:31:05.169789 2483 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 23 18:31:05.171997 kubelet[2483]: I0123 18:31:05.171256 2483 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 23 18:31:05.171997 kubelet[2483]: I0123 18:31:05.171285 2483 server.go:1287] "Started kubelet" Jan 23 18:31:05.171997 kubelet[2483]: W0123 18:31:05.171369 2483 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://46.62.169.9:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 46.62.169.9:6443: connect: connection refused Jan 23 18:31:05.171997 kubelet[2483]: E0123 18:31:05.171422 2483 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://46.62.169.9:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 46.62.169.9:6443: connect: connection refused" logger="UnhandledError" Jan 23 18:31:05.171997 kubelet[2483]: W0123 18:31:05.171468 2483 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://46.62.169.9:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-1-0-c-e2d32aff86&limit=500&resourceVersion=0": dial tcp 46.62.169.9:6443: connect: connection refused Jan 23 18:31:05.171997 kubelet[2483]: E0123 18:31:05.171493 2483 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://46.62.169.9:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-1-0-c-e2d32aff86&limit=500&resourceVersion=0\": dial tcp 46.62.169.9:6443: connect: connection refused" logger="UnhandledError" Jan 23 18:31:05.178138 kubelet[2483]: I0123 18:31:05.178092 2483 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 23 18:31:05.179117 kubelet[2483]: I0123 18:31:05.179056 2483 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 23 18:31:05.179117 kubelet[2483]: I0123 18:31:05.179105 2483 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 23 18:31:05.180118 kubelet[2483]: I0123 18:31:05.179672 2483 server.go:479] "Adding debug 
handlers to kubelet server" Jan 23 18:31:05.181370 kubelet[2483]: I0123 18:31:05.181323 2483 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 23 18:31:05.181515 kubelet[2483]: I0123 18:31:05.181492 2483 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 23 18:31:05.183107 kubelet[2483]: E0123 18:31:05.182181 2483 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://46.62.169.9:6443/api/v1/namespaces/default/events\": dial tcp 46.62.169.9:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-1-0-c-e2d32aff86.188d6fb81c9e3786 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-1-0-c-e2d32aff86,UID:ci-4547-1-0-c-e2d32aff86,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-1-0-c-e2d32aff86,},FirstTimestamp:2026-01-23 18:31:05.17126951 +0000 UTC m=+0.367001674,LastTimestamp:2026-01-23 18:31:05.17126951 +0000 UTC m=+0.367001674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-1-0-c-e2d32aff86,}" Jan 23 18:31:05.185163 kubelet[2483]: E0123 18:31:05.185139 2483 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 23 18:31:05.184000 audit[2494]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2494 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:05.184000 audit[2494]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7fff1aaa66e0 a2=0 a3=0 items=0 ppid=2483 pid=2494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.184000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 23 18:31:05.187225 kubelet[2483]: I0123 18:31:05.187200 2483 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 23 18:31:05.187431 kubelet[2483]: I0123 18:31:05.187417 2483 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 23 18:31:05.187523 kubelet[2483]: I0123 18:31:05.187452 2483 reconciler.go:26] "Reconciler: start to sync state" Jan 23 18:31:05.187657 kubelet[2483]: W0123 18:31:05.187625 2483 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://46.62.169.9:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 46.62.169.9:6443: connect: connection refused Jan 23 18:31:05.187739 kubelet[2483]: E0123 18:31:05.187655 2483 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://46.62.169.9:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 46.62.169.9:6443: connect: connection refused" logger="UnhandledError" Jan 23 18:31:05.188888 kubelet[2483]: I0123 18:31:05.188869 2483 factory.go:221] Registration of the containerd container factory successfully Jan 23 18:31:05.188888 kubelet[2483]: I0123 18:31:05.188879 2483 factory.go:221] Registration of the systemd container 
factory successfully Jan 23 18:31:05.189086 kubelet[2483]: I0123 18:31:05.188930 2483 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 23 18:31:05.189000 audit[2495]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2495 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:05.189000 audit[2495]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc050bacb0 a2=0 a3=0 items=0 ppid=2483 pid=2495 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.189000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 23 18:31:05.193512 kubelet[2483]: E0123 18:31:05.193480 2483 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:05.193000 audit[2497]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2497 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:05.193000 audit[2497]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffd75a26980 a2=0 a3=0 items=0 ppid=2483 pid=2497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.193000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 23 18:31:05.196000 audit[2499]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2499 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:05.196000 audit[2499]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffd9ada3920 a2=0 a3=0 items=0 ppid=2483 pid=2499 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.196000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 23 18:31:05.203000 audit[2502]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2502 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:05.203000 audit[2502]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffc889d1d70 a2=0 a3=0 items=0 ppid=2483 pid=2502 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.203000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 23 18:31:05.205167 kubelet[2483]: I0123 18:31:05.204658 2483 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Jan 23 18:31:05.204000 audit[2503]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2503 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:05.204000 audit[2503]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffd1990af80 a2=0 a3=0 items=0 ppid=2483 pid=2503 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.204000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 23 18:31:05.205897 kubelet[2483]: I0123 18:31:05.205831 2483 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 23 18:31:05.205897 kubelet[2483]: I0123 18:31:05.205843 2483 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 23 18:31:05.205897 kubelet[2483]: I0123 18:31:05.205856 2483 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 23 18:31:05.205897 kubelet[2483]: I0123 18:31:05.205862 2483 kubelet.go:2382] "Starting kubelet main sync loop" Jan 23 18:31:05.205897 kubelet[2483]: E0123 18:31:05.205895 2483 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 23 18:31:05.205000 audit[2504]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2504 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:05.205000 audit[2504]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffea55e89b0 a2=0 a3=0 items=0 ppid=2483 pid=2504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.205000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 23 18:31:05.210034 kubelet[2483]: E0123 18:31:05.209597 2483 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.62.169.9:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-1-0-c-e2d32aff86?timeout=10s\": dial tcp 46.62.169.9:6443: connect: connection refused" interval="200ms" Jan 23 18:31:05.209000 audit[2509]: NETFILTER_CFG table=nat:49 family=2 entries=1 op=nft_register_chain pid=2509 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:05.209000 audit[2509]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffccb4832a0 a2=0 a3=0 items=0 ppid=2483 pid=2509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.209000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 23 18:31:05.211000 audit[2510]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_chain pid=2510 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:05.211000 audit[2510]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff9fa83cb0 a2=0 a3=0 items=0 ppid=2483 pid=2510 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.211000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 23 18:31:05.212854 kubelet[2483]: W0123 18:31:05.211904 2483 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://46.62.169.9:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 46.62.169.9:6443: connect: connection refused Jan 23 18:31:05.212854 kubelet[2483]: E0123 18:31:05.212586 2483 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://46.62.169.9:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 46.62.169.9:6443: connect: connection refused" logger="UnhandledError" Jan 23 18:31:05.215000 audit[2507]: NETFILTER_CFG table=mangle:51 family=10 entries=1 op=nft_register_chain pid=2507 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:05.215000 audit[2507]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffc4da2560 a2=0 a3=0 items=0 ppid=2483 pid=2507 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.215000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 23 18:31:05.217000 audit[2514]: NETFILTER_CFG table=nat:52 family=10 entries=1 op=nft_register_chain pid=2514 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:05.217000 audit[2514]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc7c755350 a2=0 a3=0 items=0 ppid=2483 pid=2514 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.217000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 23 18:31:05.218000 audit[2515]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2515 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:05.218000 audit[2515]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd8cc7bf40 a2=0 a3=0 items=0 ppid=2483 pid=2515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.218000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 23 18:31:05.221322 kubelet[2483]: I0123 18:31:05.221293 2483 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 23 18:31:05.221322 kubelet[2483]: I0123 18:31:05.221303 2483 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 23 18:31:05.221322 kubelet[2483]: I0123 18:31:05.221315 2483 state_mem.go:36] "Initialized new in-memory state store" Jan 23 18:31:05.223577 kubelet[2483]: I0123 18:31:05.223543 2483 policy_none.go:49] "None policy: Start" Jan 23 18:31:05.223577 kubelet[2483]: I0123 18:31:05.223557 2483 
memory_manager.go:186] "Starting memorymanager" policy="None" Jan 23 18:31:05.223577 kubelet[2483]: I0123 18:31:05.223566 2483 state_mem.go:35] "Initializing new in-memory state store" Jan 23 18:31:05.230696 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 23 18:31:05.244044 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 23 18:31:05.248216 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 23 18:31:05.256572 kubelet[2483]: I0123 18:31:05.256538 2483 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 23 18:31:05.257226 kubelet[2483]: I0123 18:31:05.256688 2483 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 23 18:31:05.257226 kubelet[2483]: I0123 18:31:05.256699 2483 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 23 18:31:05.257919 kubelet[2483]: I0123 18:31:05.257755 2483 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 23 18:31:05.259048 kubelet[2483]: E0123 18:31:05.258999 2483 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 23 18:31:05.259112 kubelet[2483]: E0123 18:31:05.259055 2483 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:05.324755 systemd[1]: Created slice kubepods-burstable-pod4570ba94197a9e72761517edb8aa395e.slice - libcontainer container kubepods-burstable-pod4570ba94197a9e72761517edb8aa395e.slice. Jan 23 18:31:05.350002 kubelet[2483]: E0123 18:31:05.349914 2483 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.354527 systemd[1]: Created slice kubepods-burstable-podf579a4e40d79e63645203ec87bc306f5.slice - libcontainer container kubepods-burstable-podf579a4e40d79e63645203ec87bc306f5.slice. Jan 23 18:31:05.359700 kubelet[2483]: I0123 18:31:05.359305 2483 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.359849 kubelet[2483]: E0123 18:31:05.359787 2483 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.62.169.9:6443/api/v1/nodes\": dial tcp 46.62.169.9:6443: connect: connection refused" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.364331 kubelet[2483]: E0123 18:31:05.364263 2483 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.369827 systemd[1]: Created slice kubepods-burstable-pod08048b8587436a94c660c878372e5eea.slice - libcontainer container kubepods-burstable-pod08048b8587436a94c660c878372e5eea.slice. 
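The PROCTITLE field in the NETFILTER_CFG audit records above is the invoking command line, hex-encoded with NUL-separated arguments. A minimal Python sketch for decoding one of them (the hex is copied verbatim from the KUBE-FIREWALL entry above; the same approach works for the ip6tables and runc records later in the log):

# Decode an audit PROCTITLE value: hex(argv joined by NUL bytes) -> readable command line.
hex_proctitle = (
    "69707461626C6573002D770035002D5700313030303030"
    "002D4E004B5542452D4649524557414C4C002D740066696C746572"
)
argv = bytes.fromhex(hex_proctitle).split(b"\x00")
print(" ".join(a.decode() for a in argv))
# -> iptables -w 5 -W 100000 -N KUBE-FIREWALL -t filter

Decoded this way, the records show the kubelet creating its KUBE-IPTABLES-HINT, KUBE-FIREWALL and KUBE-KUBELET-CANARY chains at startup, before any pods are scheduled.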
Jan 23 18:31:05.373589 kubelet[2483]: E0123 18:31:05.373488 2483 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.388798 kubelet[2483]: I0123 18:31:05.388723 2483 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f579a4e40d79e63645203ec87bc306f5-ca-certs\") pod \"kube-controller-manager-ci-4547-1-0-c-e2d32aff86\" (UID: \"f579a4e40d79e63645203ec87bc306f5\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.388798 kubelet[2483]: I0123 18:31:05.388768 2483 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f579a4e40d79e63645203ec87bc306f5-kubeconfig\") pod \"kube-controller-manager-ci-4547-1-0-c-e2d32aff86\" (UID: \"f579a4e40d79e63645203ec87bc306f5\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.388798 kubelet[2483]: I0123 18:31:05.388800 2483 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f579a4e40d79e63645203ec87bc306f5-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-1-0-c-e2d32aff86\" (UID: \"f579a4e40d79e63645203ec87bc306f5\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.389120 kubelet[2483]: I0123 18:31:05.388826 2483 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4570ba94197a9e72761517edb8aa395e-ca-certs\") pod \"kube-apiserver-ci-4547-1-0-c-e2d32aff86\" (UID: \"4570ba94197a9e72761517edb8aa395e\") " pod="kube-system/kube-apiserver-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.389120 kubelet[2483]: I0123 18:31:05.388880 2483 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4570ba94197a9e72761517edb8aa395e-k8s-certs\") pod \"kube-apiserver-ci-4547-1-0-c-e2d32aff86\" (UID: \"4570ba94197a9e72761517edb8aa395e\") " pod="kube-system/kube-apiserver-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.389120 kubelet[2483]: I0123 18:31:05.388905 2483 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4570ba94197a9e72761517edb8aa395e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-1-0-c-e2d32aff86\" (UID: \"4570ba94197a9e72761517edb8aa395e\") " pod="kube-system/kube-apiserver-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.389120 kubelet[2483]: I0123 18:31:05.388929 2483 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/f579a4e40d79e63645203ec87bc306f5-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-1-0-c-e2d32aff86\" (UID: \"f579a4e40d79e63645203ec87bc306f5\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.389120 kubelet[2483]: I0123 18:31:05.388951 2483 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f579a4e40d79e63645203ec87bc306f5-k8s-certs\") pod 
\"kube-controller-manager-ci-4547-1-0-c-e2d32aff86\" (UID: \"f579a4e40d79e63645203ec87bc306f5\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.389292 kubelet[2483]: I0123 18:31:05.389010 2483 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/08048b8587436a94c660c878372e5eea-kubeconfig\") pod \"kube-scheduler-ci-4547-1-0-c-e2d32aff86\" (UID: \"08048b8587436a94c660c878372e5eea\") " pod="kube-system/kube-scheduler-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.410617 kubelet[2483]: E0123 18:31:05.410529 2483 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.62.169.9:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-1-0-c-e2d32aff86?timeout=10s\": dial tcp 46.62.169.9:6443: connect: connection refused" interval="400ms" Jan 23 18:31:05.563968 kubelet[2483]: I0123 18:31:05.563897 2483 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.564471 kubelet[2483]: E0123 18:31:05.564418 2483 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.62.169.9:6443/api/v1/nodes\": dial tcp 46.62.169.9:6443: connect: connection refused" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.652830 containerd[1684]: time="2026-01-23T18:31:05.652659391Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-1-0-c-e2d32aff86,Uid:4570ba94197a9e72761517edb8aa395e,Namespace:kube-system,Attempt:0,}" Jan 23 18:31:05.665357 containerd[1684]: time="2026-01-23T18:31:05.665291956Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-1-0-c-e2d32aff86,Uid:f579a4e40d79e63645203ec87bc306f5,Namespace:kube-system,Attempt:0,}" Jan 23 18:31:05.692536 containerd[1684]: time="2026-01-23T18:31:05.692364117Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-1-0-c-e2d32aff86,Uid:08048b8587436a94c660c878372e5eea,Namespace:kube-system,Attempt:0,}" Jan 23 18:31:05.697310 containerd[1684]: time="2026-01-23T18:31:05.697242999Z" level=info msg="connecting to shim 521e94faf5fe9adba87f7326e36f357d743e550ac78739032598f9c0dbefc009" address="unix:///run/containerd/s/76a3d74e70925c9c256644a760b4e2a706d297d2e4e8266f2478e8f1167b88a1" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:05.756059 containerd[1684]: time="2026-01-23T18:31:05.755911844Z" level=info msg="connecting to shim 122948b6e8b4fe1f4fbad1128bb1e7b50757b209e120520b5b23c8b3b3d015dc" address="unix:///run/containerd/s/063034fe046166f4557eac37cab47473d2a5b3ecaa810e6d23d522aa7e27c73c" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:05.759467 systemd[1]: Started cri-containerd-521e94faf5fe9adba87f7326e36f357d743e550ac78739032598f9c0dbefc009.scope - libcontainer container 521e94faf5fe9adba87f7326e36f357d743e550ac78739032598f9c0dbefc009. 
Jan 23 18:31:05.803816 containerd[1684]: time="2026-01-23T18:31:05.803539683Z" level=info msg="connecting to shim bb7960dc02fea21ad0fcf26cf8e1de2be2a94cc06c246024c736886d9737c19a" address="unix:///run/containerd/s/37559ca7bd6eb011ab10a6f4523370c88f7cd22c0df7658fef01a07f7971f05b" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:05.807000 audit: BPF prog-id=83 op=LOAD Jan 23 18:31:05.808000 audit: BPF prog-id=84 op=LOAD Jan 23 18:31:05.808000 audit[2537]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2524 pid=2537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.808000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532316539346661663566653961646261383766373332366533366633 Jan 23 18:31:05.808000 audit: BPF prog-id=84 op=UNLOAD Jan 23 18:31:05.808000 audit[2537]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2524 pid=2537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.808000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532316539346661663566653961646261383766373332366533366633 Jan 23 18:31:05.809000 audit: BPF prog-id=85 op=LOAD Jan 23 18:31:05.809000 audit[2537]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2524 pid=2537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.811241 kubelet[2483]: E0123 18:31:05.811184 2483 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.62.169.9:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-1-0-c-e2d32aff86?timeout=10s\": dial tcp 46.62.169.9:6443: connect: connection refused" interval="800ms" Jan 23 18:31:05.809000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532316539346661663566653961646261383766373332366533366633 Jan 23 18:31:05.810000 audit: BPF prog-id=86 op=LOAD Jan 23 18:31:05.810000 audit[2537]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2524 pid=2537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.810000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532316539346661663566653961646261383766373332366533366633 Jan 23 18:31:05.812000 audit: BPF prog-id=86 op=UNLOAD Jan 23 18:31:05.812000 audit[2537]: SYSCALL arch=c000003e 
syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2524 pid=2537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.812000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532316539346661663566653961646261383766373332366533366633 Jan 23 18:31:05.812000 audit: BPF prog-id=85 op=UNLOAD Jan 23 18:31:05.812000 audit[2537]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2524 pid=2537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.812000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532316539346661663566653961646261383766373332366533366633 Jan 23 18:31:05.812000 audit: BPF prog-id=87 op=LOAD Jan 23 18:31:05.812000 audit[2537]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2524 pid=2537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.812000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532316539346661663566653961646261383766373332366533366633 Jan 23 18:31:05.823211 systemd[1]: Started cri-containerd-122948b6e8b4fe1f4fbad1128bb1e7b50757b209e120520b5b23c8b3b3d015dc.scope - libcontainer container 122948b6e8b4fe1f4fbad1128bb1e7b50757b209e120520b5b23c8b3b3d015dc. Jan 23 18:31:05.835211 systemd[1]: Started cri-containerd-bb7960dc02fea21ad0fcf26cf8e1de2be2a94cc06c246024c736886d9737c19a.scope - libcontainer container bb7960dc02fea21ad0fcf26cf8e1de2be2a94cc06c246024c736886d9737c19a. 
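The bursts of "audit: BPF prog-id=... op=LOAD/UNLOAD" records come from runc setting up each container: the adjacent SYSCALL records show comm="runc" with syscall 321, which is bpf(2) on x86-64, and the pattern repeats for every sandbox and container started below. A small, hedged helper for tallying them from a saved copy of this console log, read from stdin:

import re
import sys
from collections import Counter

# Count journald-forwarded audit records of the form "audit: BPF prog-id=N op=LOAD|UNLOAD".
ops = Counter()
for line in sys.stdin:
    for op in re.findall(r"audit: BPF prog-id=\d+ op=(LOAD|UNLOAD)", line):
        ops[op] += 1
print(dict(ops))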
Jan 23 18:31:05.842000 audit: BPF prog-id=88 op=LOAD Jan 23 18:31:05.842000 audit: BPF prog-id=89 op=LOAD Jan 23 18:31:05.842000 audit[2581]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=2558 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.842000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132323934386236653862346665316634666261643131323862623165 Jan 23 18:31:05.842000 audit: BPF prog-id=89 op=UNLOAD Jan 23 18:31:05.842000 audit[2581]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2558 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.842000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132323934386236653862346665316634666261643131323862623165 Jan 23 18:31:05.842000 audit: BPF prog-id=90 op=LOAD Jan 23 18:31:05.842000 audit[2581]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2558 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.842000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132323934386236653862346665316634666261643131323862623165 Jan 23 18:31:05.842000 audit: BPF prog-id=91 op=LOAD Jan 23 18:31:05.842000 audit[2581]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2558 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.842000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132323934386236653862346665316634666261643131323862623165 Jan 23 18:31:05.842000 audit: BPF prog-id=91 op=UNLOAD Jan 23 18:31:05.842000 audit[2581]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2558 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.842000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132323934386236653862346665316634666261643131323862623165 Jan 23 18:31:05.842000 audit: BPF prog-id=90 op=UNLOAD Jan 23 18:31:05.842000 audit[2581]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2558 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.842000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132323934386236653862346665316634666261643131323862623165 Jan 23 18:31:05.842000 audit: BPF prog-id=92 op=LOAD Jan 23 18:31:05.842000 audit[2581]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2558 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.842000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3132323934386236653862346665316634666261643131323862623165 Jan 23 18:31:05.850000 audit: BPF prog-id=93 op=LOAD Jan 23 18:31:05.850000 audit: BPF prog-id=94 op=LOAD Jan 23 18:31:05.850000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=2593 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.850000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262373936306463303266656132316164306663663236636638653164 Jan 23 18:31:05.850000 audit: BPF prog-id=94 op=UNLOAD Jan 23 18:31:05.850000 audit[2610]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.850000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262373936306463303266656132316164306663663236636638653164 Jan 23 18:31:05.850000 audit: BPF prog-id=95 op=LOAD Jan 23 18:31:05.850000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c488 a2=98 a3=0 items=0 ppid=2593 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.850000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262373936306463303266656132316164306663663236636638653164 Jan 23 18:31:05.850000 audit: BPF prog-id=96 op=LOAD Jan 23 18:31:05.850000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00010c218 a2=98 a3=0 items=0 ppid=2593 pid=2610 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.850000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262373936306463303266656132316164306663663236636638653164 Jan 23 18:31:05.850000 audit: BPF prog-id=96 op=UNLOAD Jan 23 18:31:05.850000 audit[2610]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.850000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262373936306463303266656132316164306663663236636638653164 Jan 23 18:31:05.850000 audit: BPF prog-id=95 op=UNLOAD Jan 23 18:31:05.850000 audit[2610]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.850000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262373936306463303266656132316164306663663236636638653164 Jan 23 18:31:05.850000 audit: BPF prog-id=97 op=LOAD Jan 23 18:31:05.850000 audit[2610]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c6e8 a2=98 a3=0 items=0 ppid=2593 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.850000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262373936306463303266656132316164306663663236636638653164 Jan 23 18:31:05.888907 containerd[1684]: time="2026-01-23T18:31:05.888859679Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-1-0-c-e2d32aff86,Uid:4570ba94197a9e72761517edb8aa395e,Namespace:kube-system,Attempt:0,} returns sandbox id \"521e94faf5fe9adba87f7326e36f357d743e550ac78739032598f9c0dbefc009\"" Jan 23 18:31:05.889435 containerd[1684]: time="2026-01-23T18:31:05.889422189Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-1-0-c-e2d32aff86,Uid:08048b8587436a94c660c878372e5eea,Namespace:kube-system,Attempt:0,} returns sandbox id \"bb7960dc02fea21ad0fcf26cf8e1de2be2a94cc06c246024c736886d9737c19a\"" Jan 23 18:31:05.890996 containerd[1684]: time="2026-01-23T18:31:05.890967580Z" level=info msg="CreateContainer within sandbox \"521e94faf5fe9adba87f7326e36f357d743e550ac78739032598f9c0dbefc009\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 23 18:31:05.892045 containerd[1684]: time="2026-01-23T18:31:05.892004510Z" level=info 
msg="CreateContainer within sandbox \"bb7960dc02fea21ad0fcf26cf8e1de2be2a94cc06c246024c736886d9737c19a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 23 18:31:05.900454 containerd[1684]: time="2026-01-23T18:31:05.900428594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-1-0-c-e2d32aff86,Uid:f579a4e40d79e63645203ec87bc306f5,Namespace:kube-system,Attempt:0,} returns sandbox id \"122948b6e8b4fe1f4fbad1128bb1e7b50757b209e120520b5b23c8b3b3d015dc\"" Jan 23 18:31:05.901969 containerd[1684]: time="2026-01-23T18:31:05.901913714Z" level=info msg="CreateContainer within sandbox \"122948b6e8b4fe1f4fbad1128bb1e7b50757b209e120520b5b23c8b3b3d015dc\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 23 18:31:05.903113 containerd[1684]: time="2026-01-23T18:31:05.903065155Z" level=info msg="Container b6569ded53a57e06bd5bf4ec0613d5adf7f494e587c38b56cdfb8453b7385dd6: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:05.905580 containerd[1684]: time="2026-01-23T18:31:05.905565486Z" level=info msg="Container 97401b6b5e5fc652a29045400076507002eb8dd08f3844eda4ad3eefd747b565: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:05.909595 containerd[1684]: time="2026-01-23T18:31:05.909574478Z" level=info msg="CreateContainer within sandbox \"521e94faf5fe9adba87f7326e36f357d743e550ac78739032598f9c0dbefc009\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"b6569ded53a57e06bd5bf4ec0613d5adf7f494e587c38b56cdfb8453b7385dd6\"" Jan 23 18:31:05.909880 containerd[1684]: time="2026-01-23T18:31:05.909851888Z" level=info msg="StartContainer for \"b6569ded53a57e06bd5bf4ec0613d5adf7f494e587c38b56cdfb8453b7385dd6\"" Jan 23 18:31:05.911052 containerd[1684]: time="2026-01-23T18:31:05.911019378Z" level=info msg="connecting to shim b6569ded53a57e06bd5bf4ec0613d5adf7f494e587c38b56cdfb8453b7385dd6" address="unix:///run/containerd/s/76a3d74e70925c9c256644a760b4e2a706d297d2e4e8266f2478e8f1167b88a1" protocol=ttrpc version=3 Jan 23 18:31:05.913540 containerd[1684]: time="2026-01-23T18:31:05.913518159Z" level=info msg="CreateContainer within sandbox \"bb7960dc02fea21ad0fcf26cf8e1de2be2a94cc06c246024c736886d9737c19a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"97401b6b5e5fc652a29045400076507002eb8dd08f3844eda4ad3eefd747b565\"" Jan 23 18:31:05.913765 containerd[1684]: time="2026-01-23T18:31:05.913749009Z" level=info msg="StartContainer for \"97401b6b5e5fc652a29045400076507002eb8dd08f3844eda4ad3eefd747b565\"" Jan 23 18:31:05.916046 containerd[1684]: time="2026-01-23T18:31:05.915929660Z" level=info msg="connecting to shim 97401b6b5e5fc652a29045400076507002eb8dd08f3844eda4ad3eefd747b565" address="unix:///run/containerd/s/37559ca7bd6eb011ab10a6f4523370c88f7cd22c0df7658fef01a07f7971f05b" protocol=ttrpc version=3 Jan 23 18:31:05.916206 containerd[1684]: time="2026-01-23T18:31:05.916181870Z" level=info msg="Container 34552e2cdddcb2f220fe2fe9c8b5d075d6a7dec203202f9625dd41b222505ac6: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:05.932124 systemd[1]: Started cri-containerd-b6569ded53a57e06bd5bf4ec0613d5adf7f494e587c38b56cdfb8453b7385dd6.scope - libcontainer container b6569ded53a57e06bd5bf4ec0613d5adf7f494e587c38b56cdfb8453b7385dd6. 
Jan 23 18:31:05.940004 containerd[1684]: time="2026-01-23T18:31:05.939953830Z" level=info msg="CreateContainer within sandbox \"122948b6e8b4fe1f4fbad1128bb1e7b50757b209e120520b5b23c8b3b3d015dc\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"34552e2cdddcb2f220fe2fe9c8b5d075d6a7dec203202f9625dd41b222505ac6\"" Jan 23 18:31:05.940479 containerd[1684]: time="2026-01-23T18:31:05.940465270Z" level=info msg="StartContainer for \"34552e2cdddcb2f220fe2fe9c8b5d075d6a7dec203202f9625dd41b222505ac6\"" Jan 23 18:31:05.941197 systemd[1]: Started cri-containerd-97401b6b5e5fc652a29045400076507002eb8dd08f3844eda4ad3eefd747b565.scope - libcontainer container 97401b6b5e5fc652a29045400076507002eb8dd08f3844eda4ad3eefd747b565. Jan 23 18:31:05.941450 containerd[1684]: time="2026-01-23T18:31:05.941412011Z" level=info msg="connecting to shim 34552e2cdddcb2f220fe2fe9c8b5d075d6a7dec203202f9625dd41b222505ac6" address="unix:///run/containerd/s/063034fe046166f4557eac37cab47473d2a5b3ecaa810e6d23d522aa7e27c73c" protocol=ttrpc version=3 Jan 23 18:31:05.951000 audit: BPF prog-id=98 op=LOAD Jan 23 18:31:05.953000 audit: BPF prog-id=99 op=LOAD Jan 23 18:31:05.953000 audit[2658]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2524 pid=2658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.953000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236353639646564353361353765303662643562663465633036313364 Jan 23 18:31:05.953000 audit: BPF prog-id=99 op=UNLOAD Jan 23 18:31:05.953000 audit[2658]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2524 pid=2658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.953000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236353639646564353361353765303662643562663465633036313364 Jan 23 18:31:05.953000 audit: BPF prog-id=100 op=LOAD Jan 23 18:31:05.953000 audit[2658]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2524 pid=2658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.953000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236353639646564353361353765303662643562663465633036313364 Jan 23 18:31:05.953000 audit: BPF prog-id=101 op=LOAD Jan 23 18:31:05.953000 audit[2658]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2524 pid=2658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.953000 
audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236353639646564353361353765303662643562663465633036313364 Jan 23 18:31:05.953000 audit: BPF prog-id=101 op=UNLOAD Jan 23 18:31:05.953000 audit[2658]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2524 pid=2658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.953000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236353639646564353361353765303662643562663465633036313364 Jan 23 18:31:05.953000 audit: BPF prog-id=100 op=UNLOAD Jan 23 18:31:05.953000 audit[2658]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2524 pid=2658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.953000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236353639646564353361353765303662643562663465633036313364 Jan 23 18:31:05.953000 audit: BPF prog-id=102 op=LOAD Jan 23 18:31:05.953000 audit[2658]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2524 pid=2658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.953000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236353639646564353361353765303662643562663465633036313364 Jan 23 18:31:05.961284 systemd[1]: Started cri-containerd-34552e2cdddcb2f220fe2fe9c8b5d075d6a7dec203202f9625dd41b222505ac6.scope - libcontainer container 34552e2cdddcb2f220fe2fe9c8b5d075d6a7dec203202f9625dd41b222505ac6. 
Jan 23 18:31:05.965000 audit: BPF prog-id=103 op=LOAD Jan 23 18:31:05.966000 audit: BPF prog-id=104 op=LOAD Jan 23 18:31:05.966000 audit[2666]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=2593 pid=2666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937343031623662356535666336353261323930343534303030373635 Jan 23 18:31:05.966000 audit: BPF prog-id=104 op=UNLOAD Jan 23 18:31:05.966000 audit[2666]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=2666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937343031623662356535666336353261323930343534303030373635 Jan 23 18:31:05.966000 audit: BPF prog-id=105 op=LOAD Jan 23 18:31:05.966000 audit[2666]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=2593 pid=2666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937343031623662356535666336353261323930343534303030373635 Jan 23 18:31:05.966000 audit: BPF prog-id=106 op=LOAD Jan 23 18:31:05.966000 audit[2666]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=2593 pid=2666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937343031623662356535666336353261323930343534303030373635 Jan 23 18:31:05.966000 audit: BPF prog-id=106 op=UNLOAD Jan 23 18:31:05.966000 audit[2666]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=2666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937343031623662356535666336353261323930343534303030373635 Jan 23 18:31:05.966000 audit: BPF prog-id=105 op=UNLOAD Jan 23 18:31:05.966000 audit[2666]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=2666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937343031623662356535666336353261323930343534303030373635 Jan 23 18:31:05.970002 kubelet[2483]: I0123 18:31:05.968834 2483 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.970002 kubelet[2483]: E0123 18:31:05.969177 2483 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.62.169.9:6443/api/v1/nodes\": dial tcp 46.62.169.9:6443: connect: connection refused" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:05.966000 audit: BPF prog-id=107 op=LOAD Jan 23 18:31:05.966000 audit[2666]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=2593 pid=2666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3937343031623662356535666336353261323930343534303030373635 Jan 23 18:31:05.982000 audit: BPF prog-id=108 op=LOAD Jan 23 18:31:05.984000 audit: BPF prog-id=109 op=LOAD Jan 23 18:31:05.984000 audit[2689]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=2558 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.984000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334353532653263646464636232663232306665326665396338623564 Jan 23 18:31:05.984000 audit: BPF prog-id=109 op=UNLOAD Jan 23 18:31:05.984000 audit[2689]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2558 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.984000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334353532653263646464636232663232306665326665396338623564 Jan 23 18:31:05.984000 audit: BPF prog-id=110 op=LOAD Jan 23 18:31:05.984000 audit[2689]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=2558 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.984000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334353532653263646464636232663232306665326665396338623564 Jan 23 18:31:05.984000 audit: BPF prog-id=111 op=LOAD Jan 23 18:31:05.984000 audit[2689]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=2558 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.984000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334353532653263646464636232663232306665326665396338623564 Jan 23 18:31:05.984000 audit: BPF prog-id=111 op=UNLOAD Jan 23 18:31:05.984000 audit[2689]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2558 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.984000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334353532653263646464636232663232306665326665396338623564 Jan 23 18:31:05.984000 audit: BPF prog-id=110 op=UNLOAD Jan 23 18:31:05.984000 audit[2689]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2558 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.984000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334353532653263646464636232663232306665326665396338623564 Jan 23 18:31:05.984000 audit: BPF prog-id=112 op=LOAD Jan 23 18:31:05.984000 audit[2689]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=2558 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:05.984000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334353532653263646464636232663232306665326665396338623564 Jan 23 18:31:05.996534 containerd[1684]: time="2026-01-23T18:31:05.996505624Z" level=info msg="StartContainer for \"b6569ded53a57e06bd5bf4ec0613d5adf7f494e587c38b56cdfb8453b7385dd6\" returns successfully" Jan 23 18:31:06.007922 kubelet[2483]: W0123 18:31:06.007877 2483 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://46.62.169.9:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-1-0-c-e2d32aff86&limit=500&resourceVersion=0": dial tcp 46.62.169.9:6443: connect: connection refused Jan 23 18:31:06.008208 
kubelet[2483]: E0123 18:31:06.007930 2483 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://46.62.169.9:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-1-0-c-e2d32aff86&limit=500&resourceVersion=0\": dial tcp 46.62.169.9:6443: connect: connection refused" logger="UnhandledError" Jan 23 18:31:06.025993 containerd[1684]: time="2026-01-23T18:31:06.025045276Z" level=info msg="StartContainer for \"34552e2cdddcb2f220fe2fe9c8b5d075d6a7dec203202f9625dd41b222505ac6\" returns successfully" Jan 23 18:31:06.035482 containerd[1684]: time="2026-01-23T18:31:06.035459810Z" level=info msg="StartContainer for \"97401b6b5e5fc652a29045400076507002eb8dd08f3844eda4ad3eefd747b565\" returns successfully" Jan 23 18:31:06.221472 kubelet[2483]: E0123 18:31:06.221403 2483 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:06.225438 kubelet[2483]: E0123 18:31:06.225306 2483 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:06.229098 kubelet[2483]: E0123 18:31:06.229082 2483 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:06.771342 kubelet[2483]: I0123 18:31:06.771314 2483 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:07.230987 kubelet[2483]: E0123 18:31:07.230903 2483 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:07.231284 kubelet[2483]: E0123 18:31:07.231183 2483 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:07.284806 kubelet[2483]: E0123 18:31:07.284749 2483 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4547-1-0-c-e2d32aff86\" not found" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:07.321668 kubelet[2483]: I0123 18:31:07.321638 2483 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:07.321668 kubelet[2483]: E0123 18:31:07.321668 2483 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4547-1-0-c-e2d32aff86\": node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:07.335966 kubelet[2483]: E0123 18:31:07.335939 2483 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:07.436131 kubelet[2483]: E0123 18:31:07.436097 2483 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:07.536630 kubelet[2483]: E0123 18:31:07.536283 2483 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:07.638162 kubelet[2483]: E0123 18:31:07.638079 2483 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 
23 18:31:07.739015 kubelet[2483]: E0123 18:31:07.738878 2483 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:07.839316 kubelet[2483]: E0123 18:31:07.839148 2483 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:07.940394 kubelet[2483]: E0123 18:31:07.940325 2483 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:08.041451 kubelet[2483]: E0123 18:31:08.041385 2483 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:08.141749 kubelet[2483]: E0123 18:31:08.141576 2483 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:08.242531 kubelet[2483]: E0123 18:31:08.242444 2483 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:08.343130 kubelet[2483]: E0123 18:31:08.343054 2483 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:08.444428 kubelet[2483]: E0123 18:31:08.444272 2483 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:08.533769 kubelet[2483]: E0123 18:31:08.533712 2483 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:08.545499 kubelet[2483]: E0123 18:31:08.545251 2483 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:08.647261 kubelet[2483]: E0123 18:31:08.647177 2483 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-1-0-c-e2d32aff86\" not found" Jan 23 18:31:08.694247 kubelet[2483]: I0123 18:31:08.693990 2483 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:08.704037 kubelet[2483]: I0123 18:31:08.703887 2483 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:08.708540 kubelet[2483]: I0123 18:31:08.708483 2483 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:09.169112 kubelet[2483]: I0123 18:31:09.168772 2483 apiserver.go:52] "Watching apiserver" Jan 23 18:31:09.188224 kubelet[2483]: I0123 18:31:09.188190 2483 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 23 18:31:09.250655 systemd[1]: Reload requested from client PID 2754 ('systemctl') (unit session-8.scope)... Jan 23 18:31:09.250688 systemd[1]: Reloading... Jan 23 18:31:09.386020 zram_generator::config[2801]: No configuration found. Jan 23 18:31:09.578940 systemd[1]: Reloading finished in 327 ms. Jan 23 18:31:09.606371 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 18:31:09.617302 systemd[1]: kubelet.service: Deactivated successfully. 
Jan 23 18:31:09.616000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:09.617608 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 18:31:09.618380 kernel: kauditd_printk_skb: 204 callbacks suppressed Jan 23 18:31:09.618445 kernel: audit: type=1131 audit(1769193069.616:402): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:09.619154 systemd[1]: kubelet.service: Consumed 773ms CPU time, 131.7M memory peak. Jan 23 18:31:09.624387 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 18:31:09.626000 audit: BPF prog-id=113 op=LOAD Jan 23 18:31:09.630024 kernel: audit: type=1334 audit(1769193069.626:403): prog-id=113 op=LOAD Jan 23 18:31:09.631000 audit: BPF prog-id=79 op=UNLOAD Jan 23 18:31:09.637629 kernel: audit: type=1334 audit(1769193069.631:404): prog-id=79 op=UNLOAD Jan 23 18:31:09.637703 kernel: audit: type=1334 audit(1769193069.633:405): prog-id=114 op=LOAD Jan 23 18:31:09.633000 audit: BPF prog-id=114 op=LOAD Jan 23 18:31:09.639874 kernel: audit: type=1334 audit(1769193069.633:406): prog-id=75 op=UNLOAD Jan 23 18:31:09.633000 audit: BPF prog-id=75 op=UNLOAD Jan 23 18:31:09.642030 kernel: audit: type=1334 audit(1769193069.634:407): prog-id=115 op=LOAD Jan 23 18:31:09.634000 audit: BPF prog-id=115 op=LOAD Jan 23 18:31:09.644265 kernel: audit: type=1334 audit(1769193069.634:408): prog-id=66 op=UNLOAD Jan 23 18:31:09.634000 audit: BPF prog-id=66 op=UNLOAD Jan 23 18:31:09.646480 kernel: audit: type=1334 audit(1769193069.636:409): prog-id=116 op=LOAD Jan 23 18:31:09.636000 audit: BPF prog-id=116 op=LOAD Jan 23 18:31:09.648633 kernel: audit: type=1334 audit(1769193069.636:410): prog-id=117 op=LOAD Jan 23 18:31:09.636000 audit: BPF prog-id=117 op=LOAD Jan 23 18:31:09.650785 kernel: audit: type=1334 audit(1769193069.636:411): prog-id=67 op=UNLOAD Jan 23 18:31:09.636000 audit: BPF prog-id=67 op=UNLOAD Jan 23 18:31:09.636000 audit: BPF prog-id=68 op=UNLOAD Jan 23 18:31:09.637000 audit: BPF prog-id=118 op=LOAD Jan 23 18:31:09.637000 audit: BPF prog-id=69 op=UNLOAD Jan 23 18:31:09.637000 audit: BPF prog-id=119 op=LOAD Jan 23 18:31:09.637000 audit: BPF prog-id=120 op=LOAD Jan 23 18:31:09.637000 audit: BPF prog-id=70 op=UNLOAD Jan 23 18:31:09.637000 audit: BPF prog-id=71 op=UNLOAD Jan 23 18:31:09.641000 audit: BPF prog-id=121 op=LOAD Jan 23 18:31:09.641000 audit: BPF prog-id=76 op=UNLOAD Jan 23 18:31:09.641000 audit: BPF prog-id=122 op=LOAD Jan 23 18:31:09.641000 audit: BPF prog-id=123 op=LOAD Jan 23 18:31:09.641000 audit: BPF prog-id=77 op=UNLOAD Jan 23 18:31:09.641000 audit: BPF prog-id=78 op=UNLOAD Jan 23 18:31:09.647000 audit: BPF prog-id=124 op=LOAD Jan 23 18:31:09.647000 audit: BPF prog-id=80 op=UNLOAD Jan 23 18:31:09.647000 audit: BPF prog-id=125 op=LOAD Jan 23 18:31:09.647000 audit: BPF prog-id=126 op=LOAD Jan 23 18:31:09.647000 audit: BPF prog-id=81 op=UNLOAD Jan 23 18:31:09.647000 audit: BPF prog-id=82 op=UNLOAD Jan 23 18:31:09.649000 audit: BPF prog-id=127 op=LOAD Jan 23 18:31:09.649000 audit: BPF prog-id=72 op=UNLOAD Jan 23 18:31:09.649000 audit: BPF prog-id=128 op=LOAD Jan 23 18:31:09.649000 audit: BPF prog-id=129 op=LOAD Jan 23 18:31:09.649000 audit: BPF prog-id=73 op=UNLOAD Jan 23 18:31:09.649000 audit: BPF 
prog-id=74 op=UNLOAD Jan 23 18:31:09.650000 audit: BPF prog-id=130 op=LOAD Jan 23 18:31:09.650000 audit: BPF prog-id=131 op=LOAD Jan 23 18:31:09.650000 audit: BPF prog-id=64 op=UNLOAD Jan 23 18:31:09.650000 audit: BPF prog-id=65 op=UNLOAD Jan 23 18:31:09.653000 audit: BPF prog-id=132 op=LOAD Jan 23 18:31:09.653000 audit: BPF prog-id=63 op=UNLOAD Jan 23 18:31:09.793893 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 18:31:09.793000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:09.803250 (kubelet)[2852]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 23 18:31:09.854426 kubelet[2852]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 23 18:31:09.854426 kubelet[2852]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 23 18:31:09.854426 kubelet[2852]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 23 18:31:09.854426 kubelet[2852]: I0123 18:31:09.854090 2852 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 23 18:31:09.864689 kubelet[2852]: I0123 18:31:09.864654 2852 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 23 18:31:09.866024 kubelet[2852]: I0123 18:31:09.864820 2852 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 23 18:31:09.866024 kubelet[2852]: I0123 18:31:09.865204 2852 server.go:954] "Client rotation is on, will bootstrap in background" Jan 23 18:31:09.867213 kubelet[2852]: I0123 18:31:09.867184 2852 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 23 18:31:09.870603 kubelet[2852]: I0123 18:31:09.870561 2852 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 23 18:31:09.877614 kubelet[2852]: I0123 18:31:09.877591 2852 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 23 18:31:09.880696 kubelet[2852]: I0123 18:31:09.880642 2852 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 23 18:31:09.880858 kubelet[2852]: I0123 18:31:09.880817 2852 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 23 18:31:09.880963 kubelet[2852]: I0123 18:31:09.880839 2852 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-1-0-c-e2d32aff86","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 23 18:31:09.880963 kubelet[2852]: I0123 18:31:09.880951 2852 topology_manager.go:138] "Creating topology manager with none policy" Jan 23 18:31:09.880963 kubelet[2852]: I0123 18:31:09.880957 2852 container_manager_linux.go:304] "Creating device plugin manager" Jan 23 18:31:09.880963 kubelet[2852]: I0123 18:31:09.881014 2852 state_mem.go:36] "Initialized new in-memory state store" Jan 23 18:31:09.881345 kubelet[2852]: I0123 18:31:09.881151 2852 kubelet.go:446] "Attempting to sync node with API server" Jan 23 18:31:09.881345 kubelet[2852]: I0123 18:31:09.881169 2852 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 23 18:31:09.881345 kubelet[2852]: I0123 18:31:09.881185 2852 kubelet.go:352] "Adding apiserver pod source" Jan 23 18:31:09.881345 kubelet[2852]: I0123 18:31:09.881193 2852 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 23 18:31:09.887010 kubelet[2852]: I0123 18:31:09.885755 2852 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 23 18:31:09.887010 kubelet[2852]: I0123 18:31:09.886063 2852 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 23 18:31:09.887010 kubelet[2852]: I0123 18:31:09.886368 2852 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 23 18:31:09.887010 kubelet[2852]: I0123 18:31:09.886385 2852 server.go:1287] "Started kubelet" Jan 23 18:31:09.888239 kubelet[2852]: I0123 18:31:09.888208 2852 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 23 18:31:09.889522 kubelet[2852]: I0123 18:31:09.889491 2852 apiserver.go:52] 
"Watching apiserver" Jan 23 18:31:09.895604 kubelet[2852]: I0123 18:31:09.894965 2852 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 23 18:31:09.896193 kubelet[2852]: I0123 18:31:09.896074 2852 server.go:479] "Adding debug handlers to kubelet server" Jan 23 18:31:09.896871 kubelet[2852]: I0123 18:31:09.896826 2852 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 23 18:31:09.897020 kubelet[2852]: I0123 18:31:09.896994 2852 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 23 18:31:09.897884 kubelet[2852]: I0123 18:31:09.897853 2852 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 23 18:31:09.899609 kubelet[2852]: E0123 18:31:09.899580 2852 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 23 18:31:09.902100 kubelet[2852]: I0123 18:31:09.899810 2852 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 23 18:31:09.903518 kubelet[2852]: I0123 18:31:09.899822 2852 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 23 18:31:09.904361 kubelet[2852]: I0123 18:31:09.904167 2852 reconciler.go:26] "Reconciler: start to sync state" Jan 23 18:31:09.907007 kubelet[2852]: I0123 18:31:09.905881 2852 factory.go:221] Registration of the systemd container factory successfully Jan 23 18:31:09.907434 kubelet[2852]: I0123 18:31:09.906930 2852 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 23 18:31:09.908967 kubelet[2852]: I0123 18:31:09.908933 2852 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 23 18:31:09.909505 kubelet[2852]: I0123 18:31:09.909476 2852 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 23 18:31:09.909505 kubelet[2852]: I0123 18:31:09.909502 2852 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 23 18:31:09.909598 kubelet[2852]: I0123 18:31:09.909515 2852 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jan 23 18:31:09.909598 kubelet[2852]: I0123 18:31:09.909521 2852 kubelet.go:2382] "Starting kubelet main sync loop" Jan 23 18:31:09.909598 kubelet[2852]: E0123 18:31:09.909575 2852 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 23 18:31:09.918047 kubelet[2852]: I0123 18:31:09.917217 2852 factory.go:221] Registration of the containerd container factory successfully Jan 23 18:31:09.964277 kubelet[2852]: I0123 18:31:09.963429 2852 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 23 18:31:09.964277 kubelet[2852]: I0123 18:31:09.963458 2852 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 23 18:31:09.964277 kubelet[2852]: I0123 18:31:09.963474 2852 state_mem.go:36] "Initialized new in-memory state store" Jan 23 18:31:09.964277 kubelet[2852]: I0123 18:31:09.963603 2852 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 23 18:31:09.964277 kubelet[2852]: I0123 18:31:09.963610 2852 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 23 18:31:09.964277 kubelet[2852]: I0123 18:31:09.963623 2852 policy_none.go:49] "None policy: Start" Jan 23 18:31:09.964277 kubelet[2852]: I0123 18:31:09.963631 2852 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 23 18:31:09.964277 kubelet[2852]: I0123 18:31:09.963639 2852 state_mem.go:35] "Initializing new in-memory state store" Jan 23 18:31:09.964277 kubelet[2852]: I0123 18:31:09.963722 2852 state_mem.go:75] "Updated machine memory state" Jan 23 18:31:09.969452 kubelet[2852]: I0123 18:31:09.969422 2852 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 23 18:31:09.969669 kubelet[2852]: I0123 18:31:09.969641 2852 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 23 18:31:09.970042 kubelet[2852]: I0123 18:31:09.969801 2852 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 23 18:31:09.970690 kubelet[2852]: I0123 18:31:09.970660 2852 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 23 18:31:09.971765 kubelet[2852]: E0123 18:31:09.971733 2852 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 23 18:31:10.040559 kubelet[2852]: I0123 18:31:10.040460 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4547-1-0-c-e2d32aff86" podStartSLOduration=2.040436838 podStartE2EDuration="2.040436838s" podCreationTimestamp="2026-01-23 18:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 18:31:10.030896214 +0000 UTC m=+0.223930174" watchObservedRunningTime="2026-01-23 18:31:10.040436838 +0000 UTC m=+0.233470818" Jan 23 18:31:10.050945 kubelet[2852]: I0123 18:31:10.050857 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4547-1-0-c-e2d32aff86" podStartSLOduration=2.050841572 podStartE2EDuration="2.050841572s" podCreationTimestamp="2026-01-23 18:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 18:31:10.050763592 +0000 UTC m=+0.243797562" watchObservedRunningTime="2026-01-23 18:31:10.050841572 +0000 UTC m=+0.243875542" Jan 23 18:31:10.051355 kubelet[2852]: I0123 18:31:10.051275 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4547-1-0-c-e2d32aff86" podStartSLOduration=2.051265802 podStartE2EDuration="2.051265802s" podCreationTimestamp="2026-01-23 18:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 18:31:10.041631808 +0000 UTC m=+0.234665778" watchObservedRunningTime="2026-01-23 18:31:10.051265802 +0000 UTC m=+0.244299802" Jan 23 18:31:10.079145 kubelet[2852]: I0123 18:31:10.079097 2852 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:10.091222 kubelet[2852]: I0123 18:31:10.091153 2852 kubelet_node_status.go:124] "Node was previously registered" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:10.091327 kubelet[2852]: I0123 18:31:10.091295 2852 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:10.104739 kubelet[2852]: I0123 18:31:10.104351 2852 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 23 18:31:10.105539 kubelet[2852]: I0123 18:31:10.105497 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/f579a4e40d79e63645203ec87bc306f5-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-1-0-c-e2d32aff86\" (UID: \"f579a4e40d79e63645203ec87bc306f5\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:10.105733 kubelet[2852]: I0123 18:31:10.105707 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/08048b8587436a94c660c878372e5eea-kubeconfig\") pod \"kube-scheduler-ci-4547-1-0-c-e2d32aff86\" (UID: \"08048b8587436a94c660c878372e5eea\") " pod="kube-system/kube-scheduler-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:10.106039 kubelet[2852]: I0123 18:31:10.105829 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4570ba94197a9e72761517edb8aa395e-k8s-certs\") pod 
\"kube-apiserver-ci-4547-1-0-c-e2d32aff86\" (UID: \"4570ba94197a9e72761517edb8aa395e\") " pod="kube-system/kube-apiserver-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:10.106039 kubelet[2852]: I0123 18:31:10.105862 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4570ba94197a9e72761517edb8aa395e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-1-0-c-e2d32aff86\" (UID: \"4570ba94197a9e72761517edb8aa395e\") " pod="kube-system/kube-apiserver-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:10.106039 kubelet[2852]: I0123 18:31:10.105890 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f579a4e40d79e63645203ec87bc306f5-ca-certs\") pod \"kube-controller-manager-ci-4547-1-0-c-e2d32aff86\" (UID: \"f579a4e40d79e63645203ec87bc306f5\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:10.106039 kubelet[2852]: I0123 18:31:10.105912 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4570ba94197a9e72761517edb8aa395e-ca-certs\") pod \"kube-apiserver-ci-4547-1-0-c-e2d32aff86\" (UID: \"4570ba94197a9e72761517edb8aa395e\") " pod="kube-system/kube-apiserver-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:10.106039 kubelet[2852]: I0123 18:31:10.105936 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f579a4e40d79e63645203ec87bc306f5-k8s-certs\") pod \"kube-controller-manager-ci-4547-1-0-c-e2d32aff86\" (UID: \"f579a4e40d79e63645203ec87bc306f5\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:10.106286 kubelet[2852]: I0123 18:31:10.105959 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f579a4e40d79e63645203ec87bc306f5-kubeconfig\") pod \"kube-controller-manager-ci-4547-1-0-c-e2d32aff86\" (UID: \"f579a4e40d79e63645203ec87bc306f5\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:10.106797 kubelet[2852]: I0123 18:31:10.106350 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f579a4e40d79e63645203ec87bc306f5-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-1-0-c-e2d32aff86\" (UID: \"f579a4e40d79e63645203ec87bc306f5\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:16.207031 kubelet[2852]: I0123 18:31:16.206937 2852 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 23 18:31:16.207862 containerd[1684]: time="2026-01-23T18:31:16.207676767Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 23 18:31:16.209120 kubelet[2852]: I0123 18:31:16.208200 2852 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 23 18:31:17.185601 systemd[1]: Created slice kubepods-besteffort-pod4dc9ed46_7917_4b86_99b9_bfccf8b332f6.slice - libcontainer container kubepods-besteffort-pod4dc9ed46_7917_4b86_99b9_bfccf8b332f6.slice. 
Jan 23 18:31:17.347685 kubelet[2852]: I0123 18:31:17.347524 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4dc9ed46-7917-4b86-99b9-bfccf8b332f6-xtables-lock\") pod \"kube-proxy-87tg5\" (UID: \"4dc9ed46-7917-4b86-99b9-bfccf8b332f6\") " pod="kube-system/kube-proxy-87tg5" Jan 23 18:31:17.349562 kubelet[2852]: I0123 18:31:17.348319 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4dc9ed46-7917-4b86-99b9-bfccf8b332f6-lib-modules\") pod \"kube-proxy-87tg5\" (UID: \"4dc9ed46-7917-4b86-99b9-bfccf8b332f6\") " pod="kube-system/kube-proxy-87tg5" Jan 23 18:31:17.349562 kubelet[2852]: I0123 18:31:17.348662 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a35443c2-716a-4cc2-b350-77c80906f366-var-lib-calico\") pod \"tigera-operator-7dcd859c48-pmgq6\" (UID: \"a35443c2-716a-4cc2-b350-77c80906f366\") " pod="tigera-operator/tigera-operator-7dcd859c48-pmgq6" Jan 23 18:31:17.349562 kubelet[2852]: I0123 18:31:17.348696 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/4dc9ed46-7917-4b86-99b9-bfccf8b332f6-kube-proxy\") pod \"kube-proxy-87tg5\" (UID: \"4dc9ed46-7917-4b86-99b9-bfccf8b332f6\") " pod="kube-system/kube-proxy-87tg5" Jan 23 18:31:17.349562 kubelet[2852]: I0123 18:31:17.348718 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxz6v\" (UniqueName: \"kubernetes.io/projected/4dc9ed46-7917-4b86-99b9-bfccf8b332f6-kube-api-access-xxz6v\") pod \"kube-proxy-87tg5\" (UID: \"4dc9ed46-7917-4b86-99b9-bfccf8b332f6\") " pod="kube-system/kube-proxy-87tg5" Jan 23 18:31:17.349562 kubelet[2852]: I0123 18:31:17.349502 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbllz\" (UniqueName: \"kubernetes.io/projected/a35443c2-716a-4cc2-b350-77c80906f366-kube-api-access-fbllz\") pod \"tigera-operator-7dcd859c48-pmgq6\" (UID: \"a35443c2-716a-4cc2-b350-77c80906f366\") " pod="tigera-operator/tigera-operator-7dcd859c48-pmgq6" Jan 23 18:31:17.360702 systemd[1]: Created slice kubepods-besteffort-poda35443c2_716a_4cc2_b350_77c80906f366.slice - libcontainer container kubepods-besteffort-poda35443c2_716a_4cc2_b350_77c80906f366.slice. Jan 23 18:31:17.497411 containerd[1684]: time="2026-01-23T18:31:17.497265464Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-87tg5,Uid:4dc9ed46-7917-4b86-99b9-bfccf8b332f6,Namespace:kube-system,Attempt:0,}" Jan 23 18:31:17.527191 containerd[1684]: time="2026-01-23T18:31:17.527044796Z" level=info msg="connecting to shim 60d590f7c0dbf42c8264e14ae6dcfe59e81c81710f099ec5c7150bba6bcc8130" address="unix:///run/containerd/s/c026cf080fb439a252f1a66f2663e3e982b11ff83905f340b80284aa01d47e65" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:17.578286 systemd[1]: Started cri-containerd-60d590f7c0dbf42c8264e14ae6dcfe59e81c81710f099ec5c7150bba6bcc8130.scope - libcontainer container 60d590f7c0dbf42c8264e14ae6dcfe59e81c81710f099ec5c7150bba6bcc8130. 
Jan 23 18:31:17.598000 audit: BPF prog-id=133 op=LOAD Jan 23 18:31:17.602427 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 23 18:31:17.602494 kernel: audit: type=1334 audit(1769193077.598:444): prog-id=133 op=LOAD Jan 23 18:31:17.601000 audit: BPF prog-id=134 op=LOAD Jan 23 18:31:17.601000 audit[2919]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=2906 pid=2919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.607774 kernel: audit: type=1334 audit(1769193077.601:445): prog-id=134 op=LOAD Jan 23 18:31:17.607813 kernel: audit: type=1300 audit(1769193077.601:445): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=2906 pid=2919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.601000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630643539306637633064626634326338323634653134616536646366 Jan 23 18:31:17.613629 kernel: audit: type=1327 audit(1769193077.601:445): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630643539306637633064626634326338323634653134616536646366 Jan 23 18:31:17.619686 kernel: audit: type=1334 audit(1769193077.601:446): prog-id=134 op=UNLOAD Jan 23 18:31:17.601000 audit: BPF prog-id=134 op=UNLOAD Jan 23 18:31:17.601000 audit[2919]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2906 pid=2919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.633139 kernel: audit: type=1300 audit(1769193077.601:446): arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2906 pid=2919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.633183 kernel: audit: type=1327 audit(1769193077.601:446): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630643539306637633064626634326338323634653134616536646366 Jan 23 18:31:17.601000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630643539306637633064626634326338323634653134616536646366 Jan 23 18:31:17.601000 audit: BPF prog-id=135 op=LOAD Jan 23 18:31:17.634799 kernel: audit: type=1334 audit(1769193077.601:447): prog-id=135 op=LOAD Jan 23 18:31:17.636132 kernel: audit: type=1300 audit(1769193077.601:447): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=2906 pid=2919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.601000 audit[2919]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=2906 pid=2919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.638992 containerd[1684]: time="2026-01-23T18:31:17.638953973Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-87tg5,Uid:4dc9ed46-7917-4b86-99b9-bfccf8b332f6,Namespace:kube-system,Attempt:0,} returns sandbox id \"60d590f7c0dbf42c8264e14ae6dcfe59e81c81710f099ec5c7150bba6bcc8130\"" Jan 23 18:31:17.648451 kernel: audit: type=1327 audit(1769193077.601:447): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630643539306637633064626634326338323634653134616536646366 Jan 23 18:31:17.601000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630643539306637633064626634326338323634653134616536646366 Jan 23 18:31:17.648536 containerd[1684]: time="2026-01-23T18:31:17.642779844Z" level=info msg="CreateContainer within sandbox \"60d590f7c0dbf42c8264e14ae6dcfe59e81c81710f099ec5c7150bba6bcc8130\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 23 18:31:17.601000 audit: BPF prog-id=136 op=LOAD Jan 23 18:31:17.601000 audit[2919]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=2906 pid=2919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.601000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630643539306637633064626634326338323634653134616536646366 Jan 23 18:31:17.601000 audit: BPF prog-id=136 op=UNLOAD Jan 23 18:31:17.601000 audit[2919]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2906 pid=2919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.601000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630643539306637633064626634326338323634653134616536646366 Jan 23 18:31:17.601000 audit: BPF prog-id=135 op=UNLOAD Jan 23 18:31:17.601000 audit[2919]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2906 pid=2919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.601000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630643539306637633064626634326338323634653134616536646366 Jan 23 18:31:17.601000 audit: BPF prog-id=137 op=LOAD Jan 23 18:31:17.601000 audit[2919]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=2906 pid=2919 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.601000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3630643539306637633064626634326338323634653134616536646366 Jan 23 18:31:17.655573 containerd[1684]: time="2026-01-23T18:31:17.655501280Z" level=info msg="Container 7b9cf1d9f4e99d7a98756ca466490c4a46d3efb5592df859ad647ff99d9269df: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:17.661342 containerd[1684]: time="2026-01-23T18:31:17.661260012Z" level=info msg="CreateContainer within sandbox \"60d590f7c0dbf42c8264e14ae6dcfe59e81c81710f099ec5c7150bba6bcc8130\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"7b9cf1d9f4e99d7a98756ca466490c4a46d3efb5592df859ad647ff99d9269df\"" Jan 23 18:31:17.661808 containerd[1684]: time="2026-01-23T18:31:17.661757222Z" level=info msg="StartContainer for \"7b9cf1d9f4e99d7a98756ca466490c4a46d3efb5592df859ad647ff99d9269df\"" Jan 23 18:31:17.662699 containerd[1684]: time="2026-01-23T18:31:17.662679403Z" level=info msg="connecting to shim 7b9cf1d9f4e99d7a98756ca466490c4a46d3efb5592df859ad647ff99d9269df" address="unix:///run/containerd/s/c026cf080fb439a252f1a66f2663e3e982b11ff83905f340b80284aa01d47e65" protocol=ttrpc version=3 Jan 23 18:31:17.667629 containerd[1684]: time="2026-01-23T18:31:17.667500395Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-pmgq6,Uid:a35443c2-716a-4cc2-b350-77c80906f366,Namespace:tigera-operator,Attempt:0,}" Jan 23 18:31:17.680692 systemd[1]: Started cri-containerd-7b9cf1d9f4e99d7a98756ca466490c4a46d3efb5592df859ad647ff99d9269df.scope - libcontainer container 7b9cf1d9f4e99d7a98756ca466490c4a46d3efb5592df859ad647ff99d9269df. Jan 23 18:31:17.689786 containerd[1684]: time="2026-01-23T18:31:17.689714684Z" level=info msg="connecting to shim d7f4ac683b9ea5d2a2db7684c00b06a7eece23cac453fa1ac79e19c2b321cad5" address="unix:///run/containerd/s/ef040c12e9c3e533f13edac7ae2a78f3c6bc1a11d0df1b955880e098af1405bd" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:17.709161 systemd[1]: Started cri-containerd-d7f4ac683b9ea5d2a2db7684c00b06a7eece23cac453fa1ac79e19c2b321cad5.scope - libcontainer container d7f4ac683b9ea5d2a2db7684c00b06a7eece23cac453fa1ac79e19c2b321cad5. 
Jan 23 18:31:17.716000 audit: BPF prog-id=138 op=LOAD Jan 23 18:31:17.716000 audit[2946]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=2906 pid=2946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.716000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762396366316439663465393964376139383735366361343636343930 Jan 23 18:31:17.716000 audit: BPF prog-id=139 op=LOAD Jan 23 18:31:17.716000 audit[2946]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=2906 pid=2946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.716000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762396366316439663465393964376139383735366361343636343930 Jan 23 18:31:17.716000 audit: BPF prog-id=139 op=UNLOAD Jan 23 18:31:17.716000 audit[2946]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2906 pid=2946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.716000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762396366316439663465393964376139383735366361343636343930 Jan 23 18:31:17.716000 audit: BPF prog-id=138 op=UNLOAD Jan 23 18:31:17.716000 audit[2946]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2906 pid=2946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.716000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762396366316439663465393964376139383735366361343636343930 Jan 23 18:31:17.716000 audit: BPF prog-id=140 op=LOAD Jan 23 18:31:17.716000 audit[2946]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=2906 pid=2946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.716000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3762396366316439663465393964376139383735366361343636343930 Jan 23 18:31:17.718000 audit: BPF prog-id=141 op=LOAD Jan 23 18:31:17.718000 audit: BPF prog-id=142 op=LOAD Jan 23 18:31:17.718000 audit[2986]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2973 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.718000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437663461633638336239656135643261326462373638346330306230 Jan 23 18:31:17.718000 audit: BPF prog-id=142 op=UNLOAD Jan 23 18:31:17.718000 audit[2986]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2973 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.718000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437663461633638336239656135643261326462373638346330306230 Jan 23 18:31:17.718000 audit: BPF prog-id=143 op=LOAD Jan 23 18:31:17.718000 audit[2986]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2973 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.718000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437663461633638336239656135643261326462373638346330306230 Jan 23 18:31:17.718000 audit: BPF prog-id=144 op=LOAD Jan 23 18:31:17.718000 audit[2986]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2973 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.718000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437663461633638336239656135643261326462373638346330306230 Jan 23 18:31:17.718000 audit: BPF prog-id=144 op=UNLOAD Jan 23 18:31:17.718000 audit[2986]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2973 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.718000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437663461633638336239656135643261326462373638346330306230 Jan 23 18:31:17.718000 audit: BPF prog-id=143 op=UNLOAD Jan 23 18:31:17.718000 audit[2986]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2973 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.718000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437663461633638336239656135643261326462373638346330306230 Jan 23 18:31:17.718000 audit: BPF prog-id=145 op=LOAD Jan 23 18:31:17.718000 audit[2986]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2973 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.718000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437663461633638336239656135643261326462373638346330306230 Jan 23 18:31:17.740071 containerd[1684]: time="2026-01-23T18:31:17.740041735Z" level=info msg="StartContainer for \"7b9cf1d9f4e99d7a98756ca466490c4a46d3efb5592df859ad647ff99d9269df\" returns successfully" Jan 23 18:31:17.760028 containerd[1684]: time="2026-01-23T18:31:17.757432562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-pmgq6,Uid:a35443c2-716a-4cc2-b350-77c80906f366,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"d7f4ac683b9ea5d2a2db7684c00b06a7eece23cac453fa1ac79e19c2b321cad5\"" Jan 23 18:31:17.760028 containerd[1684]: time="2026-01-23T18:31:17.759067903Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 23 18:31:17.869000 audit[3054]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3054 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:17.869000 audit[3054]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc2b68be10 a2=0 a3=7ffc2b68bdfc items=0 ppid=2960 pid=3054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.869000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 23 18:31:17.871000 audit[3056]: NETFILTER_CFG table=nat:55 family=2 entries=1 op=nft_register_chain pid=3056 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:17.871000 audit[3056]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffda791db20 a2=0 a3=7ffda791db0c items=0 ppid=2960 pid=3056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.871000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 23 18:31:17.874000 audit[3055]: NETFILTER_CFG table=mangle:56 family=10 entries=1 op=nft_register_chain pid=3055 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:17.874000 audit[3055]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffdff411230 a2=0 a3=7ffdff41121c items=0 ppid=2960 pid=3055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.874000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 23 18:31:17.875000 audit[3058]: NETFILTER_CFG table=nat:57 family=10 entries=1 op=nft_register_chain pid=3058 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:17.875000 audit[3058]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd225aee50 a2=0 a3=7ffd225aee3c items=0 ppid=2960 pid=3058 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.875000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 23 18:31:17.876000 audit[3057]: NETFILTER_CFG table=filter:58 family=2 entries=1 op=nft_register_chain pid=3057 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:17.876000 audit[3057]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc38a11c80 a2=0 a3=7ffc38a11c6c items=0 ppid=2960 pid=3057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.876000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 23 18:31:17.878000 audit[3059]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3059 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:17.878000 audit[3059]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffdd090bf90 a2=0 a3=7ffdd090bf7c items=0 ppid=2960 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.878000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 23 18:31:17.978659 kubelet[2852]: I0123 18:31:17.978591 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-87tg5" podStartSLOduration=0.978571014 podStartE2EDuration="978.571014ms" podCreationTimestamp="2026-01-23 18:31:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 18:31:17.978430834 +0000 UTC m=+8.171464804" watchObservedRunningTime="2026-01-23 18:31:17.978571014 +0000 UTC m=+8.171604984" Jan 23 18:31:17.985000 audit[3061]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3061 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:17.985000 audit[3061]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffd79b8e6f0 a2=0 a3=7ffd79b8e6dc items=0 ppid=2960 pid=3061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.985000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 23 18:31:17.992000 audit[3063]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3063 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:17.992000 audit[3063]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffe6a190bc0 a2=0 a3=7ffe6a190bac items=0 ppid=2960 pid=3063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:17.992000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 23 18:31:18.004000 audit[3066]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3066 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.004000 audit[3066]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7fff34f46bf0 a2=0 a3=7fff34f46bdc items=0 ppid=2960 pid=3066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.004000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 23 18:31:18.009000 audit[3067]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3067 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.009000 audit[3067]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc3693c9d0 a2=0 a3=7ffc3693c9bc items=0 ppid=2960 pid=3067 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.009000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 23 18:31:18.016000 audit[3069]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3069 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.016000 audit[3069]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe1257cf00 a2=0 a3=7ffe1257ceec items=0 ppid=2960 pid=3069 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.016000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 23 18:31:18.019000 audit[3070]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3070 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.019000 audit[3070]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe1d8ea2e0 a2=0 
a3=7ffe1d8ea2cc items=0 ppid=2960 pid=3070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.019000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 23 18:31:18.026000 audit[3072]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3072 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.026000 audit[3072]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffff31f3e30 a2=0 a3=7ffff31f3e1c items=0 ppid=2960 pid=3072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.026000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 23 18:31:18.035000 audit[3075]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3075 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.035000 audit[3075]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fff0e3ab8f0 a2=0 a3=7fff0e3ab8dc items=0 ppid=2960 pid=3075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.035000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 23 18:31:18.038000 audit[3076]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3076 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.038000 audit[3076]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcea304d60 a2=0 a3=7ffcea304d4c items=0 ppid=2960 pid=3076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.038000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 23 18:31:18.044000 audit[3078]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3078 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.044000 audit[3078]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc253818d0 a2=0 a3=7ffc253818bc items=0 ppid=2960 pid=3078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.044000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 23 18:31:18.047000 audit[3079]: NETFILTER_CFG 
table=filter:70 family=2 entries=1 op=nft_register_chain pid=3079 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.047000 audit[3079]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc527ea030 a2=0 a3=7ffc527ea01c items=0 ppid=2960 pid=3079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.047000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 23 18:31:18.054000 audit[3081]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3081 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.054000 audit[3081]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffcc6b123c0 a2=0 a3=7ffcc6b123ac items=0 ppid=2960 pid=3081 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.054000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 23 18:31:18.065000 audit[3084]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3084 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.065000 audit[3084]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffda624eae0 a2=0 a3=7ffda624eacc items=0 ppid=2960 pid=3084 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.065000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 23 18:31:18.074000 audit[3087]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3087 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.074000 audit[3087]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd1c5e9b20 a2=0 a3=7ffd1c5e9b0c items=0 ppid=2960 pid=3087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.074000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 23 18:31:18.077000 audit[3088]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3088 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.077000 audit[3088]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffcd437e010 a2=0 a3=7ffcd437dffc items=0 ppid=2960 pid=3088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.077000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 23 18:31:18.083000 audit[3090]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3090 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.083000 audit[3090]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffcb0cdeba0 a2=0 a3=7ffcb0cdeb8c items=0 ppid=2960 pid=3090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.083000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 18:31:18.092000 audit[3093]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3093 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.092000 audit[3093]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fff9b9fb720 a2=0 a3=7fff9b9fb70c items=0 ppid=2960 pid=3093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.092000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 18:31:18.095000 audit[3094]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3094 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.095000 audit[3094]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffee860dab0 a2=0 a3=7ffee860da9c items=0 ppid=2960 pid=3094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.095000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 23 18:31:18.101000 audit[3096]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3096 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 18:31:18.101000 audit[3096]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7fffd4dfa000 a2=0 a3=7fffd4df9fec items=0 ppid=2960 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.101000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 23 18:31:18.144000 audit[3102]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3102 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:18.144000 audit[3102]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffdc65a9b40 a2=0 
a3=7ffdc65a9b2c items=0 ppid=2960 pid=3102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.144000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:18.156000 audit[3102]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3102 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:18.156000 audit[3102]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7ffdc65a9b40 a2=0 a3=7ffdc65a9b2c items=0 ppid=2960 pid=3102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.156000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:18.160000 audit[3107]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3107 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.160000 audit[3107]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe4043b870 a2=0 a3=7ffe4043b85c items=0 ppid=2960 pid=3107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.160000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 23 18:31:18.166000 audit[3109]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3109 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.166000 audit[3109]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffe9fe5ed30 a2=0 a3=7ffe9fe5ed1c items=0 ppid=2960 pid=3109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.166000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 23 18:31:18.176000 audit[3112]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3112 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.176000 audit[3112]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffc8e1bf030 a2=0 a3=7ffc8e1bf01c items=0 ppid=2960 pid=3112 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.176000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 23 18:31:18.179000 audit[3113]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain 
pid=3113 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.179000 audit[3113]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffee026f280 a2=0 a3=7ffee026f26c items=0 ppid=2960 pid=3113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.179000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 23 18:31:18.185000 audit[3115]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3115 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.185000 audit[3115]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffec4391970 a2=0 a3=7ffec439195c items=0 ppid=2960 pid=3115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.185000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 23 18:31:18.188000 audit[3116]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3116 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.188000 audit[3116]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd5219f290 a2=0 a3=7ffd5219f27c items=0 ppid=2960 pid=3116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.188000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 23 18:31:18.194000 audit[3118]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3118 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.194000 audit[3118]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffd6eb72fc0 a2=0 a3=7ffd6eb72fac items=0 ppid=2960 pid=3118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.194000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 23 18:31:18.205000 audit[3121]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3121 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.205000 audit[3121]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffc313be2a0 a2=0 a3=7ffc313be28c items=0 ppid=2960 pid=3121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.205000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 23 18:31:18.208000 audit[3122]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3122 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.208000 audit[3122]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffefb8b5120 a2=0 a3=7ffefb8b510c items=0 ppid=2960 pid=3122 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.208000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 23 18:31:18.216000 audit[3124]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3124 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.216000 audit[3124]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc279f64d0 a2=0 a3=7ffc279f64bc items=0 ppid=2960 pid=3124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.216000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 23 18:31:18.219000 audit[3125]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3125 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.219000 audit[3125]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcdbd1ddd0 a2=0 a3=7ffcdbd1ddbc items=0 ppid=2960 pid=3125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.219000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 23 18:31:18.226000 audit[3127]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3127 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.226000 audit[3127]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd9005ede0 a2=0 a3=7ffd9005edcc items=0 ppid=2960 pid=3127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.226000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 23 18:31:18.237000 audit[3130]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3130 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.237000 audit[3130]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe4fb9dc40 a2=0 a3=7ffe4fb9dc2c 
items=0 ppid=2960 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.237000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 23 18:31:18.246000 audit[3133]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3133 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.246000 audit[3133]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffdc3da5b20 a2=0 a3=7ffdc3da5b0c items=0 ppid=2960 pid=3133 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.246000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 23 18:31:18.248000 audit[3134]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3134 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.248000 audit[3134]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe4ac1ab10 a2=0 a3=7ffe4ac1aafc items=0 ppid=2960 pid=3134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.248000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 23 18:31:18.255000 audit[3136]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3136 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.255000 audit[3136]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffc2ba482b0 a2=0 a3=7ffc2ba4829c items=0 ppid=2960 pid=3136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.255000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 18:31:18.263000 audit[3139]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3139 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.263000 audit[3139]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fffdb740680 a2=0 a3=7fffdb74066c items=0 ppid=2960 pid=3139 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.263000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 18:31:18.267000 audit[3140]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3140 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.267000 audit[3140]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc104ee5d0 a2=0 a3=7ffc104ee5bc items=0 ppid=2960 pid=3140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.267000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 23 18:31:18.273000 audit[3142]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3142 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.273000 audit[3142]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffc36bf0430 a2=0 a3=7ffc36bf041c items=0 ppid=2960 pid=3142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.273000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 23 18:31:18.276000 audit[3143]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3143 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.276000 audit[3143]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd1d1035c0 a2=0 a3=7ffd1d1035ac items=0 ppid=2960 pid=3143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.276000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 23 18:31:18.282000 audit[3145]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3145 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.282000 audit[3145]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffccf222100 a2=0 a3=7ffccf2220ec items=0 ppid=2960 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.282000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 23 18:31:18.291000 audit[3148]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3148 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 18:31:18.291000 audit[3148]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffeaacd0400 a2=0 a3=7ffeaacd03ec items=0 ppid=2960 pid=3148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.291000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 23 18:31:18.299000 audit[3150]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3150 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 23 18:31:18.299000 audit[3150]: SYSCALL arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7ffd4e8e9e90 a2=0 a3=7ffd4e8e9e7c items=0 ppid=2960 pid=3150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.299000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:18.300000 audit[3150]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3150 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 23 18:31:18.300000 audit[3150]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7ffd4e8e9e90 a2=0 a3=7ffd4e8e9e7c items=0 ppid=2960 pid=3150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:18.300000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:19.887957 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3624906080.mount: Deactivated successfully. Jan 23 18:31:20.532061 containerd[1684]: time="2026-01-23T18:31:20.532021888Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:20.533172 containerd[1684]: time="2026-01-23T18:31:20.533053458Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=23558205" Jan 23 18:31:20.533986 containerd[1684]: time="2026-01-23T18:31:20.533953248Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:20.535686 containerd[1684]: time="2026-01-23T18:31:20.535667519Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:20.536453 containerd[1684]: time="2026-01-23T18:31:20.536261909Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.777176256s" Jan 23 18:31:20.536453 containerd[1684]: time="2026-01-23T18:31:20.536281449Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Jan 23 18:31:20.537470 containerd[1684]: time="2026-01-23T18:31:20.537454020Z" level=info msg="CreateContainer within sandbox \"d7f4ac683b9ea5d2a2db7684c00b06a7eece23cac453fa1ac79e19c2b321cad5\" for container 
&ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 23 18:31:20.546422 containerd[1684]: time="2026-01-23T18:31:20.546071283Z" level=info msg="Container 71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:20.548622 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3254194569.mount: Deactivated successfully. Jan 23 18:31:20.560510 containerd[1684]: time="2026-01-23T18:31:20.560468109Z" level=info msg="CreateContainer within sandbox \"d7f4ac683b9ea5d2a2db7684c00b06a7eece23cac453fa1ac79e19c2b321cad5\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3\"" Jan 23 18:31:20.561372 containerd[1684]: time="2026-01-23T18:31:20.561350250Z" level=info msg="StartContainer for \"71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3\"" Jan 23 18:31:20.562437 containerd[1684]: time="2026-01-23T18:31:20.562399370Z" level=info msg="connecting to shim 71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3" address="unix:///run/containerd/s/ef040c12e9c3e533f13edac7ae2a78f3c6bc1a11d0df1b955880e098af1405bd" protocol=ttrpc version=3 Jan 23 18:31:20.580108 systemd[1]: Started cri-containerd-71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3.scope - libcontainer container 71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3. Jan 23 18:31:20.591000 audit: BPF prog-id=146 op=LOAD Jan 23 18:31:20.592000 audit: BPF prog-id=147 op=LOAD Jan 23 18:31:20.592000 audit[3159]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2973 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:20.592000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731613132653864306533656339343038353265346135646264633762 Jan 23 18:31:20.592000 audit: BPF prog-id=147 op=UNLOAD Jan 23 18:31:20.592000 audit[3159]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2973 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:20.592000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731613132653864306533656339343038353265346135646264633762 Jan 23 18:31:20.592000 audit: BPF prog-id=148 op=LOAD Jan 23 18:31:20.592000 audit[3159]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2973 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:20.592000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731613132653864306533656339343038353265346135646264633762 Jan 23 
18:31:20.592000 audit: BPF prog-id=149 op=LOAD Jan 23 18:31:20.592000 audit[3159]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2973 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:20.592000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731613132653864306533656339343038353265346135646264633762 Jan 23 18:31:20.592000 audit: BPF prog-id=149 op=UNLOAD Jan 23 18:31:20.592000 audit[3159]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2973 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:20.592000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731613132653864306533656339343038353265346135646264633762 Jan 23 18:31:20.592000 audit: BPF prog-id=148 op=UNLOAD Jan 23 18:31:20.592000 audit[3159]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2973 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:20.592000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731613132653864306533656339343038353265346135646264633762 Jan 23 18:31:20.592000 audit: BPF prog-id=150 op=LOAD Jan 23 18:31:20.592000 audit[3159]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2973 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:20.592000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731613132653864306533656339343038353265346135646264633762 Jan 23 18:31:20.611774 containerd[1684]: time="2026-01-23T18:31:20.611718291Z" level=info msg="StartContainer for \"71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3\" returns successfully" Jan 23 18:31:21.471235 systemd[1]: Started sshd@7-46.62.169.9:22-101.249.61.144:49171.service - OpenSSH per-connection server daemon (101.249.61.144:49171). Jan 23 18:31:21.470000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-46.62.169.9:22-101.249.61.144:49171 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:31:21.485045 kubelet[2852]: I0123 18:31:21.484902 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-pmgq6" podStartSLOduration=1.706540506 podStartE2EDuration="4.484879954s" podCreationTimestamp="2026-01-23 18:31:17 +0000 UTC" firstStartedPulling="2026-01-23 18:31:17.758384022 +0000 UTC m=+7.951417952" lastFinishedPulling="2026-01-23 18:31:20.53672347 +0000 UTC m=+10.729757400" observedRunningTime="2026-01-23 18:31:21.012313418 +0000 UTC m=+11.205347388" watchObservedRunningTime="2026-01-23 18:31:21.484879954 +0000 UTC m=+11.677913924" Jan 23 18:31:22.040219 systemd[1]: Started sshd@8-46.62.169.9:22-110.177.180.64:19481.service - OpenSSH per-connection server daemon (110.177.180.64:19481). Jan 23 18:31:22.039000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-46.62.169.9:22-110.177.180.64:19481 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:22.067213 sshd[3195]: banner exchange: Connection from 110.177.180.64 port 19481: invalid format Jan 23 18:31:22.068042 systemd[1]: sshd@8-46.62.169.9:22-110.177.180.64:19481.service: Deactivated successfully. Jan 23 18:31:22.067000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-46.62.169.9:22-110.177.180.64:19481 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:22.881696 kernel: kauditd_printk_skb: 227 callbacks suppressed Jan 23 18:31:22.881772 kernel: audit: type=1130 audit(1769193082.874:527): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-46.62.169.9:22-59.52.101.28:61358 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:22.874000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-46.62.169.9:22-59.52.101.28:61358 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:22.875306 systemd[1]: Started sshd@9-46.62.169.9:22-59.52.101.28:61358.service - OpenSSH per-connection server daemon (59.52.101.28:61358). Jan 23 18:31:24.180968 update_engine[1648]: I20260123 18:31:24.180171 1648 update_attempter.cc:509] Updating boot flags... Jan 23 18:31:24.511325 systemd[1]: Started sshd@10-46.62.169.9:22-124.117.193.205:17012.service - OpenSSH per-connection server daemon (124.117.193.205:17012). Jan 23 18:31:24.510000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-46.62.169.9:22-124.117.193.205:17012 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:24.521023 kernel: audit: type=1130 audit(1769193084.510:528): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-46.62.169.9:22-124.117.193.205:17012 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:25.589814 sshd[3192]: banner exchange: Connection from 101.249.61.144 port 49171: invalid format Jan 23 18:31:25.592204 systemd[1]: sshd@7-46.62.169.9:22-101.249.61.144:49171.service: Deactivated successfully. 
Jan 23 18:31:25.592000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-46.62.169.9:22-101.249.61.144:49171 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:25.605052 kernel: audit: type=1131 audit(1769193085.592:529): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-46.62.169.9:22-101.249.61.144:49171 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:25.850743 sudo[1909]: pam_unix(sudo:session): session closed for user root Jan 23 18:31:25.849000 audit[1909]: USER_END pid=1909 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:31:25.856103 kernel: audit: type=1106 audit(1769193085.849:530): pid=1909 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:31:25.849000 audit[1909]: CRED_DISP pid=1909 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:31:25.866007 kernel: audit: type=1104 audit(1769193085.849:531): pid=1909 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 18:31:25.972593 sshd[1908]: Connection closed by 4.153.228.146 port 40820 Jan 23 18:31:25.974760 sshd-session[1904]: pam_unix(sshd:session): session closed for user core Jan 23 18:31:25.985088 kernel: audit: type=1106 audit(1769193085.975:532): pid=1904 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:31:25.975000 audit[1904]: USER_END pid=1904 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:31:25.986986 systemd[1]: sshd@6-46.62.169.9:22-4.153.228.146:40820.service: Deactivated successfully. Jan 23 18:31:25.988920 systemd[1]: session-8.scope: Deactivated successfully. Jan 23 18:31:25.991248 systemd[1]: session-8.scope: Consumed 5.015s CPU time, 231M memory peak. 
Jan 23 18:31:25.975000 audit[1904]: CRED_DISP pid=1904 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:31:25.998990 kernel: audit: type=1104 audit(1769193085.975:533): pid=1904 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:31:25.999183 systemd-logind[1647]: Session 8 logged out. Waiting for processes to exit. Jan 23 18:31:25.986000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-46.62.169.9:22-4.153.228.146:40820 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:26.005932 systemd-logind[1647]: Removed session 8. Jan 23 18:31:26.006018 kernel: audit: type=1131 audit(1769193085.986:534): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-46.62.169.9:22-4.153.228.146:40820 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:26.574000 audit[3278]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3278 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:26.580100 kernel: audit: type=1325 audit(1769193086.574:535): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3278 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:26.574000 audit[3278]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffc5fd236c0 a2=0 a3=7ffc5fd236ac items=0 ppid=2960 pid=3278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:26.590000 kernel: audit: type=1300 audit(1769193086.574:535): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffc5fd236c0 a2=0 a3=7ffc5fd236ac items=0 ppid=2960 pid=3278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:26.574000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:26.584000 audit[3278]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3278 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:26.584000 audit[3278]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc5fd236c0 a2=0 a3=0 items=0 ppid=2960 pid=3278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:26.584000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:26.601000 audit[3280]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3280 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:26.601000 audit[3280]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7fffbedb7a30 a2=0 
a3=7fffbedb7a1c items=0 ppid=2960 pid=3280 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:26.601000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:26.606000 audit[3280]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3280 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:26.606000 audit[3280]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fffbedb7a30 a2=0 a3=0 items=0 ppid=2960 pid=3280 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:26.606000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:26.656318 sshd[3253]: banner exchange: Connection from 124.117.193.205 port 17012: invalid format Jan 23 18:31:26.657122 systemd[1]: sshd@10-46.62.169.9:22-124.117.193.205:17012.service: Deactivated successfully. Jan 23 18:31:26.657000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-46.62.169.9:22-124.117.193.205:17012 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:28.146014 kernel: kauditd_printk_skb: 11 callbacks suppressed Jan 23 18:31:28.146170 kernel: audit: type=1325 audit(1769193088.141:540): table=filter:109 family=2 entries=17 op=nft_register_rule pid=3284 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:28.141000 audit[3284]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3284 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:28.141000 audit[3284]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffc8a4f50c0 a2=0 a3=7ffc8a4f50ac items=0 ppid=2960 pid=3284 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:28.159078 kernel: audit: type=1300 audit(1769193088.141:540): arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffc8a4f50c0 a2=0 a3=7ffc8a4f50ac items=0 ppid=2960 pid=3284 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:28.141000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:28.167017 kernel: audit: type=1327 audit(1769193088.141:540): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:28.160000 audit[3284]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3284 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:28.160000 audit[3284]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc8a4f50c0 a2=0 a3=0 items=0 ppid=2960 pid=3284 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:28.175839 kernel: audit: type=1325 audit(1769193088.160:541): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3284 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:28.175893 kernel: audit: type=1300 audit(1769193088.160:541): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc8a4f50c0 a2=0 a3=0 items=0 ppid=2960 pid=3284 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:28.160000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:28.183399 kernel: audit: type=1327 audit(1769193088.160:541): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:29.181000 audit[3286]: NETFILTER_CFG table=filter:111 family=2 entries=19 op=nft_register_rule pid=3286 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:29.181000 audit[3286]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffd178eed80 a2=0 a3=7ffd178eed6c items=0 ppid=2960 pid=3286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.189804 kernel: audit: type=1325 audit(1769193089.181:542): table=filter:111 family=2 entries=19 op=nft_register_rule pid=3286 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:29.190125 kernel: audit: type=1300 audit(1769193089.181:542): arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffd178eed80 a2=0 a3=7ffd178eed6c items=0 ppid=2960 pid=3286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.181000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:29.197199 kernel: audit: type=1327 audit(1769193089.181:542): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:29.197000 audit[3286]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3286 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:29.202220 kernel: audit: type=1325 audit(1769193089.197:543): table=nat:112 family=2 entries=12 op=nft_register_rule pid=3286 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:29.197000 audit[3286]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd178eed80 a2=0 a3=0 items=0 ppid=2960 pid=3286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:29.197000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:30.159000 audit[3289]: NETFILTER_CFG table=filter:113 family=2 entries=21 op=nft_register_rule pid=3289 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 
18:31:30.159000 audit[3289]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffca29f0db0 a2=0 a3=7ffca29f0d9c items=0 ppid=2960 pid=3289 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.159000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:30.182888 systemd[1]: Created slice kubepods-besteffort-pod27240d2a_1677_4832_ac22_2b16d27bde43.slice - libcontainer container kubepods-besteffort-pod27240d2a_1677_4832_ac22_2b16d27bde43.slice. Jan 23 18:31:30.197000 audit[3289]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3289 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:30.197000 audit[3289]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffca29f0db0 a2=0 a3=0 items=0 ppid=2960 pid=3289 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.197000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:30.239129 kubelet[2852]: I0123 18:31:30.239055 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27240d2a-1677-4832-ac22-2b16d27bde43-tigera-ca-bundle\") pod \"calico-typha-dd56bb886-zgrqd\" (UID: \"27240d2a-1677-4832-ac22-2b16d27bde43\") " pod="calico-system/calico-typha-dd56bb886-zgrqd" Jan 23 18:31:30.239854 kubelet[2852]: I0123 18:31:30.239766 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/27240d2a-1677-4832-ac22-2b16d27bde43-typha-certs\") pod \"calico-typha-dd56bb886-zgrqd\" (UID: \"27240d2a-1677-4832-ac22-2b16d27bde43\") " pod="calico-system/calico-typha-dd56bb886-zgrqd" Jan 23 18:31:30.239854 kubelet[2852]: I0123 18:31:30.239799 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqphp\" (UniqueName: \"kubernetes.io/projected/27240d2a-1677-4832-ac22-2b16d27bde43-kube-api-access-lqphp\") pod \"calico-typha-dd56bb886-zgrqd\" (UID: \"27240d2a-1677-4832-ac22-2b16d27bde43\") " pod="calico-system/calico-typha-dd56bb886-zgrqd" Jan 23 18:31:30.441323 kubelet[2852]: I0123 18:31:30.441134 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d9e66a6f-ae24-469d-8bd9-2635bb4dc275-cni-log-dir\") pod \"calico-node-bk2m5\" (UID: \"d9e66a6f-ae24-469d-8bd9-2635bb4dc275\") " pod="calico-system/calico-node-bk2m5" Jan 23 18:31:30.441323 kubelet[2852]: I0123 18:31:30.441181 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d9e66a6f-ae24-469d-8bd9-2635bb4dc275-lib-modules\") pod \"calico-node-bk2m5\" (UID: \"d9e66a6f-ae24-469d-8bd9-2635bb4dc275\") " pod="calico-system/calico-node-bk2m5" Jan 23 18:31:30.441323 kubelet[2852]: I0123 18:31:30.441211 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d9e66a6f-ae24-469d-8bd9-2635bb4dc275-var-lib-calico\") pod \"calico-node-bk2m5\" (UID: \"d9e66a6f-ae24-469d-8bd9-2635bb4dc275\") " pod="calico-system/calico-node-bk2m5" Jan 23 18:31:30.441323 kubelet[2852]: I0123 18:31:30.441234 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d9e66a6f-ae24-469d-8bd9-2635bb4dc275-cni-net-dir\") pod \"calico-node-bk2m5\" (UID: \"d9e66a6f-ae24-469d-8bd9-2635bb4dc275\") " pod="calico-system/calico-node-bk2m5" Jan 23 18:31:30.441323 kubelet[2852]: I0123 18:31:30.441332 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9e66a6f-ae24-469d-8bd9-2635bb4dc275-tigera-ca-bundle\") pod \"calico-node-bk2m5\" (UID: \"d9e66a6f-ae24-469d-8bd9-2635bb4dc275\") " pod="calico-system/calico-node-bk2m5" Jan 23 18:31:30.441659 kubelet[2852]: I0123 18:31:30.441356 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d9e66a6f-ae24-469d-8bd9-2635bb4dc275-flexvol-driver-host\") pod \"calico-node-bk2m5\" (UID: \"d9e66a6f-ae24-469d-8bd9-2635bb4dc275\") " pod="calico-system/calico-node-bk2m5" Jan 23 18:31:30.441659 kubelet[2852]: I0123 18:31:30.441385 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d9e66a6f-ae24-469d-8bd9-2635bb4dc275-node-certs\") pod \"calico-node-bk2m5\" (UID: \"d9e66a6f-ae24-469d-8bd9-2635bb4dc275\") " pod="calico-system/calico-node-bk2m5" Jan 23 18:31:30.441659 kubelet[2852]: I0123 18:31:30.441407 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d9e66a6f-ae24-469d-8bd9-2635bb4dc275-policysync\") pod \"calico-node-bk2m5\" (UID: \"d9e66a6f-ae24-469d-8bd9-2635bb4dc275\") " pod="calico-system/calico-node-bk2m5" Jan 23 18:31:30.441659 kubelet[2852]: I0123 18:31:30.441431 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d9e66a6f-ae24-469d-8bd9-2635bb4dc275-var-run-calico\") pod \"calico-node-bk2m5\" (UID: \"d9e66a6f-ae24-469d-8bd9-2635bb4dc275\") " pod="calico-system/calico-node-bk2m5" Jan 23 18:31:30.441659 kubelet[2852]: I0123 18:31:30.441476 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzbxf\" (UniqueName: \"kubernetes.io/projected/d9e66a6f-ae24-469d-8bd9-2635bb4dc275-kube-api-access-zzbxf\") pod \"calico-node-bk2m5\" (UID: \"d9e66a6f-ae24-469d-8bd9-2635bb4dc275\") " pod="calico-system/calico-node-bk2m5" Jan 23 18:31:30.445603 kubelet[2852]: I0123 18:31:30.441500 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d9e66a6f-ae24-469d-8bd9-2635bb4dc275-cni-bin-dir\") pod \"calico-node-bk2m5\" (UID: \"d9e66a6f-ae24-469d-8bd9-2635bb4dc275\") " pod="calico-system/calico-node-bk2m5" Jan 23 18:31:30.445603 kubelet[2852]: I0123 18:31:30.441525 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: 
\"kubernetes.io/host-path/d9e66a6f-ae24-469d-8bd9-2635bb4dc275-xtables-lock\") pod \"calico-node-bk2m5\" (UID: \"d9e66a6f-ae24-469d-8bd9-2635bb4dc275\") " pod="calico-system/calico-node-bk2m5" Jan 23 18:31:30.449124 systemd[1]: Created slice kubepods-besteffort-podd9e66a6f_ae24_469d_8bd9_2635bb4dc275.slice - libcontainer container kubepods-besteffort-podd9e66a6f_ae24_469d_8bd9_2635bb4dc275.slice. Jan 23 18:31:30.493153 containerd[1684]: time="2026-01-23T18:31:30.493037799Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-dd56bb886-zgrqd,Uid:27240d2a-1677-4832-ac22-2b16d27bde43,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:30.524693 containerd[1684]: time="2026-01-23T18:31:30.524547545Z" level=info msg="connecting to shim a2faf6ead9637fff2437f26a3b76ee62938784ecd7d30775468cad14088a8433" address="unix:///run/containerd/s/41e04b68cf33a4eeff788ee1cbeb1826ac28195ed9683aaa57ce514da245f62d" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:30.569349 kubelet[2852]: E0123 18:31:30.569208 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.569349 kubelet[2852]: W0123 18:31:30.569235 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.569774 kubelet[2852]: E0123 18:31:30.569583 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.573275 kubelet[2852]: E0123 18:31:30.572901 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.573275 kubelet[2852]: W0123 18:31:30.572916 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.573275 kubelet[2852]: E0123 18:31:30.573093 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.593275 systemd[1]: Started cri-containerd-a2faf6ead9637fff2437f26a3b76ee62938784ecd7d30775468cad14088a8433.scope - libcontainer container a2faf6ead9637fff2437f26a3b76ee62938784ecd7d30775468cad14088a8433. 
Jan 23 18:31:30.618000 audit: BPF prog-id=151 op=LOAD Jan 23 18:31:30.620000 audit: BPF prog-id=152 op=LOAD Jan 23 18:31:30.620000 audit[3312]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=3301 pid=3312 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132666166366561643936333766666632343337663236613362373665 Jan 23 18:31:30.620000 audit: BPF prog-id=152 op=UNLOAD Jan 23 18:31:30.620000 audit[3312]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3301 pid=3312 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132666166366561643936333766666632343337663236613362373665 Jan 23 18:31:30.620000 audit: BPF prog-id=153 op=LOAD Jan 23 18:31:30.620000 audit[3312]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=3301 pid=3312 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132666166366561643936333766666632343337663236613362373665 Jan 23 18:31:30.620000 audit: BPF prog-id=154 op=LOAD Jan 23 18:31:30.620000 audit[3312]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=3301 pid=3312 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132666166366561643936333766666632343337663236613362373665 Jan 23 18:31:30.620000 audit: BPF prog-id=154 op=UNLOAD Jan 23 18:31:30.620000 audit[3312]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3301 pid=3312 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132666166366561643936333766666632343337663236613362373665 Jan 23 18:31:30.620000 audit: BPF prog-id=153 op=UNLOAD Jan 23 18:31:30.620000 audit[3312]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3301 pid=3312 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132666166366561643936333766666632343337663236613362373665 Jan 23 18:31:30.620000 audit: BPF prog-id=155 op=LOAD Jan 23 18:31:30.620000 audit[3312]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=3301 pid=3312 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132666166366561643936333766666632343337663236613362373665 Jan 23 18:31:30.652030 kubelet[2852]: E0123 18:31:30.651917 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:31:30.715512 containerd[1684]: time="2026-01-23T18:31:30.715386183Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-dd56bb886-zgrqd,Uid:27240d2a-1677-4832-ac22-2b16d27bde43,Namespace:calico-system,Attempt:0,} returns sandbox id \"a2faf6ead9637fff2437f26a3b76ee62938784ecd7d30775468cad14088a8433\"" Jan 23 18:31:30.718822 containerd[1684]: time="2026-01-23T18:31:30.718791013Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 23 18:31:30.740567 kubelet[2852]: E0123 18:31:30.740508 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.740567 kubelet[2852]: W0123 18:31:30.740540 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.740567 kubelet[2852]: E0123 18:31:30.740569 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.741162 kubelet[2852]: E0123 18:31:30.741118 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.741162 kubelet[2852]: W0123 18:31:30.741132 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.741162 kubelet[2852]: E0123 18:31:30.741140 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:30.741828 kubelet[2852]: E0123 18:31:30.741796 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.741828 kubelet[2852]: W0123 18:31:30.741809 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.741828 kubelet[2852]: E0123 18:31:30.741818 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.742307 kubelet[2852]: E0123 18:31:30.742289 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.742307 kubelet[2852]: W0123 18:31:30.742302 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.742307 kubelet[2852]: E0123 18:31:30.742310 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.742604 kubelet[2852]: E0123 18:31:30.742585 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.742604 kubelet[2852]: W0123 18:31:30.742596 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.742604 kubelet[2852]: E0123 18:31:30.742602 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.742938 kubelet[2852]: E0123 18:31:30.742915 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.742938 kubelet[2852]: W0123 18:31:30.742927 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.742938 kubelet[2852]: E0123 18:31:30.742934 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.743304 kubelet[2852]: E0123 18:31:30.743273 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.743304 kubelet[2852]: W0123 18:31:30.743294 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.743304 kubelet[2852]: E0123 18:31:30.743301 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:30.743690 kubelet[2852]: E0123 18:31:30.743650 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.743690 kubelet[2852]: W0123 18:31:30.743662 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.743690 kubelet[2852]: E0123 18:31:30.743670 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.743853 kubelet[2852]: E0123 18:31:30.743828 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.743853 kubelet[2852]: W0123 18:31:30.743839 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.743853 kubelet[2852]: E0123 18:31:30.743847 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.744112 kubelet[2852]: E0123 18:31:30.744055 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.744112 kubelet[2852]: W0123 18:31:30.744061 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.744112 kubelet[2852]: E0123 18:31:30.744067 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.744212 kubelet[2852]: E0123 18:31:30.744207 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.744212 kubelet[2852]: W0123 18:31:30.744213 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.744284 kubelet[2852]: E0123 18:31:30.744218 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.744365 kubelet[2852]: E0123 18:31:30.744354 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.744365 kubelet[2852]: W0123 18:31:30.744362 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.744414 kubelet[2852]: E0123 18:31:30.744368 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:30.744538 kubelet[2852]: E0123 18:31:30.744526 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.744538 kubelet[2852]: W0123 18:31:30.744535 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.744538 kubelet[2852]: E0123 18:31:30.744541 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.745001 kubelet[2852]: E0123 18:31:30.744841 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.745001 kubelet[2852]: W0123 18:31:30.744852 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.745001 kubelet[2852]: E0123 18:31:30.744859 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.745088 kubelet[2852]: E0123 18:31:30.745009 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.745088 kubelet[2852]: W0123 18:31:30.745016 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.745088 kubelet[2852]: E0123 18:31:30.745023 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.745251 kubelet[2852]: E0123 18:31:30.745156 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.745251 kubelet[2852]: W0123 18:31:30.745167 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.745251 kubelet[2852]: E0123 18:31:30.745175 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.745323 kubelet[2852]: E0123 18:31:30.745309 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.745323 kubelet[2852]: W0123 18:31:30.745315 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.745323 kubelet[2852]: E0123 18:31:30.745321 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:30.745476 kubelet[2852]: E0123 18:31:30.745441 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.745476 kubelet[2852]: W0123 18:31:30.745460 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.745476 kubelet[2852]: E0123 18:31:30.745466 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.745639 kubelet[2852]: E0123 18:31:30.745595 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.745639 kubelet[2852]: W0123 18:31:30.745603 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.745639 kubelet[2852]: E0123 18:31:30.745609 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.745817 kubelet[2852]: E0123 18:31:30.745731 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.745817 kubelet[2852]: W0123 18:31:30.745739 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.745817 kubelet[2852]: E0123 18:31:30.745745 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.746998 kubelet[2852]: E0123 18:31:30.746624 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.746998 kubelet[2852]: W0123 18:31:30.746639 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.746998 kubelet[2852]: E0123 18:31:30.746647 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:30.746998 kubelet[2852]: I0123 18:31:30.746667 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f61eaf28-593a-461f-8945-a34eecb93534-registration-dir\") pod \"csi-node-driver-6z82s\" (UID: \"f61eaf28-593a-461f-8945-a34eecb93534\") " pod="calico-system/csi-node-driver-6z82s" Jan 23 18:31:30.746998 kubelet[2852]: E0123 18:31:30.746808 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.746998 kubelet[2852]: W0123 18:31:30.746816 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.746998 kubelet[2852]: E0123 18:31:30.746823 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.746998 kubelet[2852]: I0123 18:31:30.746832 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/f61eaf28-593a-461f-8945-a34eecb93534-varrun\") pod \"csi-node-driver-6z82s\" (UID: \"f61eaf28-593a-461f-8945-a34eecb93534\") " pod="calico-system/csi-node-driver-6z82s" Jan 23 18:31:30.747198 kubelet[2852]: E0123 18:31:30.747054 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.747198 kubelet[2852]: W0123 18:31:30.747078 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.747198 kubelet[2852]: E0123 18:31:30.747085 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.747198 kubelet[2852]: I0123 18:31:30.747099 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc26w\" (UniqueName: \"kubernetes.io/projected/f61eaf28-593a-461f-8945-a34eecb93534-kube-api-access-sc26w\") pod \"csi-node-driver-6z82s\" (UID: \"f61eaf28-593a-461f-8945-a34eecb93534\") " pod="calico-system/csi-node-driver-6z82s" Jan 23 18:31:30.747287 kubelet[2852]: E0123 18:31:30.747271 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.747287 kubelet[2852]: W0123 18:31:30.747278 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.747287 kubelet[2852]: E0123 18:31:30.747284 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:30.747351 kubelet[2852]: I0123 18:31:30.747294 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f61eaf28-593a-461f-8945-a34eecb93534-socket-dir\") pod \"csi-node-driver-6z82s\" (UID: \"f61eaf28-593a-461f-8945-a34eecb93534\") " pod="calico-system/csi-node-driver-6z82s" Jan 23 18:31:30.748576 kubelet[2852]: E0123 18:31:30.747510 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.748576 kubelet[2852]: W0123 18:31:30.747521 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.748576 kubelet[2852]: E0123 18:31:30.747530 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.748576 kubelet[2852]: I0123 18:31:30.747541 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f61eaf28-593a-461f-8945-a34eecb93534-kubelet-dir\") pod \"csi-node-driver-6z82s\" (UID: \"f61eaf28-593a-461f-8945-a34eecb93534\") " pod="calico-system/csi-node-driver-6z82s" Jan 23 18:31:30.748842 kubelet[2852]: E0123 18:31:30.748628 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.748842 kubelet[2852]: W0123 18:31:30.748637 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.748842 kubelet[2852]: E0123 18:31:30.748649 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.749994 kubelet[2852]: E0123 18:31:30.749017 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.749994 kubelet[2852]: W0123 18:31:30.749030 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.749994 kubelet[2852]: E0123 18:31:30.749042 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.749994 kubelet[2852]: E0123 18:31:30.749658 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.749994 kubelet[2852]: W0123 18:31:30.749667 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.749994 kubelet[2852]: E0123 18:31:30.749674 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:30.749994 kubelet[2852]: E0123 18:31:30.749866 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.749994 kubelet[2852]: W0123 18:31:30.749874 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.749994 kubelet[2852]: E0123 18:31:30.749880 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.750218 kubelet[2852]: E0123 18:31:30.750104 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.750218 kubelet[2852]: W0123 18:31:30.750111 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.750218 kubelet[2852]: E0123 18:31:30.750118 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.750987 kubelet[2852]: E0123 18:31:30.750275 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.750987 kubelet[2852]: W0123 18:31:30.750287 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.750987 kubelet[2852]: E0123 18:31:30.750295 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.750987 kubelet[2852]: E0123 18:31:30.750476 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.750987 kubelet[2852]: W0123 18:31:30.750482 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.750987 kubelet[2852]: E0123 18:31:30.750506 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.750987 kubelet[2852]: E0123 18:31:30.750646 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.750987 kubelet[2852]: W0123 18:31:30.750651 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.750987 kubelet[2852]: E0123 18:31:30.750677 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:30.750987 kubelet[2852]: E0123 18:31:30.750841 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.751202 kubelet[2852]: W0123 18:31:30.750846 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.751202 kubelet[2852]: E0123 18:31:30.750855 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.751202 kubelet[2852]: E0123 18:31:30.751024 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.751202 kubelet[2852]: W0123 18:31:30.751030 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.751202 kubelet[2852]: E0123 18:31:30.751039 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.756795 containerd[1684]: time="2026-01-23T18:31:30.756758402Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bk2m5,Uid:d9e66a6f-ae24-469d-8bd9-2635bb4dc275,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:30.779547 containerd[1684]: time="2026-01-23T18:31:30.779504410Z" level=info msg="connecting to shim b4b2b0b9cbb4b49f3c36ed0e5e4e27567e87774c763ae3f943aaa71796322d49" address="unix:///run/containerd/s/8a1760322a4f03d96cf542910c0595ae697dee76f40b258d936a36060bdc7ddb" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:30.802138 systemd[1]: Started cri-containerd-b4b2b0b9cbb4b49f3c36ed0e5e4e27567e87774c763ae3f943aaa71796322d49.scope - libcontainer container b4b2b0b9cbb4b49f3c36ed0e5e4e27567e87774c763ae3f943aaa71796322d49. 
Jan 23 18:31:30.812000 audit: BPF prog-id=156 op=LOAD Jan 23 18:31:30.813000 audit: BPF prog-id=157 op=LOAD Jan 23 18:31:30.813000 audit[3406]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=3394 pid=3406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234623262306239636262346234396633633336656430653565346532 Jan 23 18:31:30.813000 audit: BPF prog-id=157 op=UNLOAD Jan 23 18:31:30.813000 audit[3406]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3394 pid=3406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234623262306239636262346234396633633336656430653565346532 Jan 23 18:31:30.813000 audit: BPF prog-id=158 op=LOAD Jan 23 18:31:30.813000 audit[3406]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=3394 pid=3406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234623262306239636262346234396633633336656430653565346532 Jan 23 18:31:30.813000 audit: BPF prog-id=159 op=LOAD Jan 23 18:31:30.813000 audit[3406]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=3394 pid=3406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234623262306239636262346234396633633336656430653565346532 Jan 23 18:31:30.813000 audit: BPF prog-id=159 op=UNLOAD Jan 23 18:31:30.813000 audit[3406]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3394 pid=3406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234623262306239636262346234396633633336656430653565346532 Jan 23 18:31:30.813000 audit: BPF prog-id=158 op=UNLOAD Jan 23 18:31:30.813000 audit[3406]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3394 pid=3406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234623262306239636262346234396633633336656430653565346532 Jan 23 18:31:30.814000 audit: BPF prog-id=160 op=LOAD Jan 23 18:31:30.814000 audit[3406]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=3394 pid=3406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:30.814000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234623262306239636262346234396633633336656430653565346532 Jan 23 18:31:30.831372 containerd[1684]: time="2026-01-23T18:31:30.831316267Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bk2m5,Uid:d9e66a6f-ae24-469d-8bd9-2635bb4dc275,Namespace:calico-system,Attempt:0,} returns sandbox id \"b4b2b0b9cbb4b49f3c36ed0e5e4e27567e87774c763ae3f943aaa71796322d49\"" Jan 23 18:31:30.848235 kubelet[2852]: E0123 18:31:30.848196 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.848235 kubelet[2852]: W0123 18:31:30.848213 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.848235 kubelet[2852]: E0123 18:31:30.848229 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.848486 kubelet[2852]: E0123 18:31:30.848465 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.848486 kubelet[2852]: W0123 18:31:30.848476 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.848539 kubelet[2852]: E0123 18:31:30.848493 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:30.848719 kubelet[2852]: E0123 18:31:30.848700 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.848719 kubelet[2852]: W0123 18:31:30.848711 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.848772 kubelet[2852]: E0123 18:31:30.848730 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.848959 kubelet[2852]: E0123 18:31:30.848941 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.848959 kubelet[2852]: W0123 18:31:30.848951 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.849018 kubelet[2852]: E0123 18:31:30.848968 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.849236 kubelet[2852]: E0123 18:31:30.849219 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.849236 kubelet[2852]: W0123 18:31:30.849229 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.849297 kubelet[2852]: E0123 18:31:30.849239 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.849480 kubelet[2852]: E0123 18:31:30.849433 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.849480 kubelet[2852]: W0123 18:31:30.849445 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.849528 kubelet[2852]: E0123 18:31:30.849485 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.849714 kubelet[2852]: E0123 18:31:30.849694 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.849714 kubelet[2852]: W0123 18:31:30.849706 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.849760 kubelet[2852]: E0123 18:31:30.849722 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:30.849994 kubelet[2852]: E0123 18:31:30.849966 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.849994 kubelet[2852]: W0123 18:31:30.849990 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.850038 kubelet[2852]: E0123 18:31:30.850010 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.850246 kubelet[2852]: E0123 18:31:30.850227 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.850246 kubelet[2852]: W0123 18:31:30.850238 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.850335 kubelet[2852]: E0123 18:31:30.850314 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.850477 kubelet[2852]: E0123 18:31:30.850464 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.850477 kubelet[2852]: W0123 18:31:30.850474 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.850552 kubelet[2852]: E0123 18:31:30.850541 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.850707 kubelet[2852]: E0123 18:31:30.850687 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.850707 kubelet[2852]: W0123 18:31:30.850698 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.850798 kubelet[2852]: E0123 18:31:30.850779 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.850965 kubelet[2852]: E0123 18:31:30.850947 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.850965 kubelet[2852]: W0123 18:31:30.850958 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.851162 kubelet[2852]: E0123 18:31:30.851085 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:30.851219 kubelet[2852]: E0123 18:31:30.851198 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.851219 kubelet[2852]: W0123 18:31:30.851208 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.851303 kubelet[2852]: E0123 18:31:30.851285 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.851463 kubelet[2852]: E0123 18:31:30.851439 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.851463 kubelet[2852]: W0123 18:31:30.851458 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.851511 kubelet[2852]: E0123 18:31:30.851467 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.851706 kubelet[2852]: E0123 18:31:30.851681 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.851706 kubelet[2852]: W0123 18:31:30.851691 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.851850 kubelet[2852]: E0123 18:31:30.851776 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.851913 kubelet[2852]: E0123 18:31:30.851895 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.851913 kubelet[2852]: W0123 18:31:30.851906 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.852029 kubelet[2852]: E0123 18:31:30.852010 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.852168 kubelet[2852]: E0123 18:31:30.852149 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.852168 kubelet[2852]: W0123 18:31:30.852159 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.852255 kubelet[2852]: E0123 18:31:30.852237 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:30.852415 kubelet[2852]: E0123 18:31:30.852394 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.852415 kubelet[2852]: W0123 18:31:30.852404 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.852495 kubelet[2852]: E0123 18:31:30.852482 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.852659 kubelet[2852]: E0123 18:31:30.852641 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.852659 kubelet[2852]: W0123 18:31:30.852653 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.852701 kubelet[2852]: E0123 18:31:30.852670 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.852917 kubelet[2852]: E0123 18:31:30.852898 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.852917 kubelet[2852]: W0123 18:31:30.852909 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.852951 kubelet[2852]: E0123 18:31:30.852918 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.853232 kubelet[2852]: E0123 18:31:30.853186 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.853232 kubelet[2852]: W0123 18:31:30.853225 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.853290 kubelet[2852]: E0123 18:31:30.853251 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.853489 kubelet[2852]: E0123 18:31:30.853467 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.853489 kubelet[2852]: W0123 18:31:30.853476 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.853528 kubelet[2852]: E0123 18:31:30.853508 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:30.853775 kubelet[2852]: E0123 18:31:30.853755 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.853775 kubelet[2852]: W0123 18:31:30.853766 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.854035 kubelet[2852]: E0123 18:31:30.853870 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.854122 kubelet[2852]: E0123 18:31:30.854104 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.854122 kubelet[2852]: W0123 18:31:30.854116 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.854155 kubelet[2852]: E0123 18:31:30.854123 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.854902 kubelet[2852]: E0123 18:31:30.854880 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.854902 kubelet[2852]: W0123 18:31:30.854894 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.854902 kubelet[2852]: E0123 18:31:30.854902 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:30.862541 kubelet[2852]: E0123 18:31:30.862505 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:30.862541 kubelet[2852]: W0123 18:31:30.862519 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:30.862756 kubelet[2852]: E0123 18:31:30.862723 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:31.217000 audit[3459]: NETFILTER_CFG table=filter:115 family=2 entries=22 op=nft_register_rule pid=3459 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:31.217000 audit[3459]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffdb49da240 a2=0 a3=7ffdb49da22c items=0 ppid=2960 pid=3459 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:31.217000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:31.228000 audit[3459]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3459 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:31.228000 audit[3459]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffdb49da240 a2=0 a3=0 items=0 ppid=2960 pid=3459 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:31.228000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:31.911250 kubelet[2852]: E0123 18:31:31.911154 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:31:32.576686 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount472349687.mount: Deactivated successfully. 
Jan 23 18:31:33.911334 kubelet[2852]: E0123 18:31:33.910909 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:31:34.106648 containerd[1684]: time="2026-01-23T18:31:34.106584382Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:34.107953 containerd[1684]: time="2026-01-23T18:31:34.107926357Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33735893" Jan 23 18:31:34.109184 containerd[1684]: time="2026-01-23T18:31:34.109127869Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:34.110650 containerd[1684]: time="2026-01-23T18:31:34.110633067Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:34.111231 containerd[1684]: time="2026-01-23T18:31:34.110935692Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 3.392115888s" Jan 23 18:31:34.111231 containerd[1684]: time="2026-01-23T18:31:34.110958292Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Jan 23 18:31:34.112912 containerd[1684]: time="2026-01-23T18:31:34.112893078Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 23 18:31:34.123024 containerd[1684]: time="2026-01-23T18:31:34.122996384Z" level=info msg="CreateContainer within sandbox \"a2faf6ead9637fff2437f26a3b76ee62938784ecd7d30775468cad14088a8433\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 23 18:31:34.132358 containerd[1684]: time="2026-01-23T18:31:34.129736558Z" level=info msg="Container 480406fe25e5a6f73dd736d7107c1fdd780d81e606acbf784dc4bf3fdde90701: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:34.141038 containerd[1684]: time="2026-01-23T18:31:34.140995914Z" level=info msg="CreateContainer within sandbox \"a2faf6ead9637fff2437f26a3b76ee62938784ecd7d30775468cad14088a8433\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"480406fe25e5a6f73dd736d7107c1fdd780d81e606acbf784dc4bf3fdde90701\"" Jan 23 18:31:34.141533 containerd[1684]: time="2026-01-23T18:31:34.141517924Z" level=info msg="StartContainer for \"480406fe25e5a6f73dd736d7107c1fdd780d81e606acbf784dc4bf3fdde90701\"" Jan 23 18:31:34.142556 containerd[1684]: time="2026-01-23T18:31:34.142496852Z" level=info msg="connecting to shim 480406fe25e5a6f73dd736d7107c1fdd780d81e606acbf784dc4bf3fdde90701" address="unix:///run/containerd/s/41e04b68cf33a4eeff788ee1cbeb1826ac28195ed9683aaa57ce514da245f62d" protocol=ttrpc version=3 Jan 23 18:31:34.165118 systemd[1]: Started 
cri-containerd-480406fe25e5a6f73dd736d7107c1fdd780d81e606acbf784dc4bf3fdde90701.scope - libcontainer container 480406fe25e5a6f73dd736d7107c1fdd780d81e606acbf784dc4bf3fdde90701. Jan 23 18:31:34.179000 audit: BPF prog-id=161 op=LOAD Jan 23 18:31:34.181863 kernel: kauditd_printk_skb: 58 callbacks suppressed Jan 23 18:31:34.181895 kernel: audit: type=1334 audit(1769193094.179:564): prog-id=161 op=LOAD Jan 23 18:31:34.185372 kernel: audit: type=1334 audit(1769193094.180:565): prog-id=162 op=LOAD Jan 23 18:31:34.180000 audit: BPF prog-id=162 op=LOAD Jan 23 18:31:34.196136 kernel: audit: type=1300 audit(1769193094.180:565): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=3301 pid=3471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:34.180000 audit[3471]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=3301 pid=3471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:34.203517 kernel: audit: type=1327 audit(1769193094.180:565): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438303430366665323565356136663733646437333664373130376331 Jan 23 18:31:34.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438303430366665323565356136663733646437333664373130376331 Jan 23 18:31:34.206541 kernel: audit: type=1334 audit(1769193094.180:566): prog-id=162 op=UNLOAD Jan 23 18:31:34.180000 audit: BPF prog-id=162 op=UNLOAD Jan 23 18:31:34.180000 audit[3471]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3301 pid=3471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:34.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438303430366665323565356136663733646437333664373130376331 Jan 23 18:31:34.217832 kernel: audit: type=1300 audit(1769193094.180:566): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3301 pid=3471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:34.217863 kernel: audit: type=1327 audit(1769193094.180:566): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438303430366665323565356136663733646437333664373130376331 Jan 23 18:31:34.180000 audit: BPF prog-id=163 op=LOAD Jan 23 18:31:34.224055 kernel: audit: type=1334 audit(1769193094.180:567): prog-id=163 op=LOAD Jan 23 18:31:34.180000 audit[3471]: SYSCALL arch=c000003e syscall=321 
success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3301 pid=3471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:34.228731 kernel: audit: type=1300 audit(1769193094.180:567): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=3301 pid=3471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:34.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438303430366665323565356136663733646437333664373130376331 Jan 23 18:31:34.180000 audit: BPF prog-id=164 op=LOAD Jan 23 18:31:34.242052 kernel: audit: type=1327 audit(1769193094.180:567): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438303430366665323565356136663733646437333664373130376331 Jan 23 18:31:34.180000 audit[3471]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=3301 pid=3471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:34.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438303430366665323565356136663733646437333664373130376331 Jan 23 18:31:34.180000 audit: BPF prog-id=164 op=UNLOAD Jan 23 18:31:34.180000 audit[3471]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3301 pid=3471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:34.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438303430366665323565356136663733646437333664373130376331 Jan 23 18:31:34.180000 audit: BPF prog-id=163 op=UNLOAD Jan 23 18:31:34.180000 audit[3471]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3301 pid=3471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:34.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438303430366665323565356136663733646437333664373130376331 Jan 23 18:31:34.180000 audit: BPF prog-id=165 op=LOAD Jan 23 18:31:34.180000 audit[3471]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=3301 pid=3471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:34.180000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3438303430366665323565356136663733646437333664373130376331 Jan 23 18:31:34.245449 containerd[1684]: time="2026-01-23T18:31:34.245403494Z" level=info msg="StartContainer for \"480406fe25e5a6f73dd736d7107c1fdd780d81e606acbf784dc4bf3fdde90701\" returns successfully" Jan 23 18:31:35.025199 kubelet[2852]: I0123 18:31:35.024948 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-dd56bb886-zgrqd" podStartSLOduration=1.631584454 podStartE2EDuration="5.024933767s" podCreationTimestamp="2026-01-23 18:31:30 +0000 UTC" firstStartedPulling="2026-01-23 18:31:30.718347033 +0000 UTC m=+20.911380973" lastFinishedPulling="2026-01-23 18:31:34.111696356 +0000 UTC m=+24.304730286" observedRunningTime="2026-01-23 18:31:35.024686592 +0000 UTC m=+25.217720542" watchObservedRunningTime="2026-01-23 18:31:35.024933767 +0000 UTC m=+25.217967697" Jan 23 18:31:35.077409 kubelet[2852]: E0123 18:31:35.077323 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.077409 kubelet[2852]: W0123 18:31:35.077378 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.077778 kubelet[2852]: E0123 18:31:35.077668 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.077986 kubelet[2852]: E0123 18:31:35.077959 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.078060 kubelet[2852]: W0123 18:31:35.078049 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.078128 kubelet[2852]: E0123 18:31:35.078102 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.078451 kubelet[2852]: E0123 18:31:35.078426 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.078599 kubelet[2852]: W0123 18:31:35.078521 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.078599 kubelet[2852]: E0123 18:31:35.078533 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:35.078994 kubelet[2852]: E0123 18:31:35.078880 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.078994 kubelet[2852]: W0123 18:31:35.078890 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.078994 kubelet[2852]: E0123 18:31:35.078898 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.079258 kubelet[2852]: E0123 18:31:35.079227 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.079258 kubelet[2852]: W0123 18:31:35.079239 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.079365 kubelet[2852]: E0123 18:31:35.079324 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.079601 kubelet[2852]: E0123 18:31:35.079591 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.079739 kubelet[2852]: W0123 18:31:35.079662 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.079739 kubelet[2852]: E0123 18:31:35.079673 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.079989 kubelet[2852]: E0123 18:31:35.079945 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.079989 kubelet[2852]: W0123 18:31:35.079955 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.079989 kubelet[2852]: E0123 18:31:35.079963 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.080327 kubelet[2852]: E0123 18:31:35.080275 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.080327 kubelet[2852]: W0123 18:31:35.080285 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.080327 kubelet[2852]: E0123 18:31:35.080293 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:35.080708 kubelet[2852]: E0123 18:31:35.080620 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.080708 kubelet[2852]: W0123 18:31:35.080630 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.080708 kubelet[2852]: E0123 18:31:35.080638 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.081071 kubelet[2852]: E0123 18:31:35.081038 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.081116 kubelet[2852]: W0123 18:31:35.081066 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.081116 kubelet[2852]: E0123 18:31:35.081096 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.081540 kubelet[2852]: E0123 18:31:35.081504 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.081540 kubelet[2852]: W0123 18:31:35.081524 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.081540 kubelet[2852]: E0123 18:31:35.081539 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.081930 kubelet[2852]: E0123 18:31:35.081900 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.081930 kubelet[2852]: W0123 18:31:35.081921 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.082151 kubelet[2852]: E0123 18:31:35.081936 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.082454 kubelet[2852]: E0123 18:31:35.082343 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.082454 kubelet[2852]: W0123 18:31:35.082355 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.082454 kubelet[2852]: E0123 18:31:35.082366 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:35.082683 kubelet[2852]: E0123 18:31:35.082672 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.082827 kubelet[2852]: W0123 18:31:35.082735 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.082827 kubelet[2852]: E0123 18:31:35.082747 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.083067 kubelet[2852]: E0123 18:31:35.083057 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.083170 kubelet[2852]: W0123 18:31:35.083115 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.083170 kubelet[2852]: E0123 18:31:35.083125 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.083502 kubelet[2852]: E0123 18:31:35.083481 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.083502 kubelet[2852]: W0123 18:31:35.083496 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.083502 kubelet[2852]: E0123 18:31:35.083505 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.083764 kubelet[2852]: E0123 18:31:35.083717 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.083764 kubelet[2852]: W0123 18:31:35.083725 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.083908 kubelet[2852]: E0123 18:31:35.083787 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.084216 kubelet[2852]: E0123 18:31:35.084182 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.084216 kubelet[2852]: W0123 18:31:35.084208 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.084350 kubelet[2852]: E0123 18:31:35.084244 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:35.084695 kubelet[2852]: E0123 18:31:35.084668 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.084695 kubelet[2852]: W0123 18:31:35.084689 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.084761 kubelet[2852]: E0123 18:31:35.084717 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.085256 kubelet[2852]: E0123 18:31:35.085227 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.085256 kubelet[2852]: W0123 18:31:35.085249 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.085440 kubelet[2852]: E0123 18:31:35.085377 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.085694 kubelet[2852]: E0123 18:31:35.085667 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.085694 kubelet[2852]: W0123 18:31:35.085687 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.085828 kubelet[2852]: E0123 18:31:35.085792 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.086335 kubelet[2852]: E0123 18:31:35.086230 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.086335 kubelet[2852]: W0123 18:31:35.086244 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.086403 kubelet[2852]: E0123 18:31:35.086342 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.086596 kubelet[2852]: E0123 18:31:35.086561 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.086596 kubelet[2852]: W0123 18:31:35.086578 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.086738 kubelet[2852]: E0123 18:31:35.086676 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:35.086958 kubelet[2852]: E0123 18:31:35.086922 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.086958 kubelet[2852]: W0123 18:31:35.086935 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.086958 kubelet[2852]: E0123 18:31:35.086947 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.087672 kubelet[2852]: E0123 18:31:35.087641 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.087710 kubelet[2852]: W0123 18:31:35.087672 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.087942 kubelet[2852]: E0123 18:31:35.087876 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.088124 kubelet[2852]: E0123 18:31:35.088099 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.088179 kubelet[2852]: W0123 18:31:35.088122 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.088221 kubelet[2852]: E0123 18:31:35.088207 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.088527 kubelet[2852]: E0123 18:31:35.088500 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.088550 kubelet[2852]: W0123 18:31:35.088525 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.088592 kubelet[2852]: E0123 18:31:35.088556 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.088946 kubelet[2852]: E0123 18:31:35.088922 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.088946 kubelet[2852]: W0123 18:31:35.088942 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.089042 kubelet[2852]: E0123 18:31:35.089021 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:35.089403 kubelet[2852]: E0123 18:31:35.089380 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.089403 kubelet[2852]: W0123 18:31:35.089400 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.089511 kubelet[2852]: E0123 18:31:35.089425 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.089642 kubelet[2852]: E0123 18:31:35.089631 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.089752 kubelet[2852]: W0123 18:31:35.089672 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.089752 kubelet[2852]: E0123 18:31:35.089683 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.089881 kubelet[2852]: E0123 18:31:35.089874 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.089910 kubelet[2852]: W0123 18:31:35.089903 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.089938 kubelet[2852]: E0123 18:31:35.089931 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.090460 kubelet[2852]: E0123 18:31:35.090435 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.090460 kubelet[2852]: W0123 18:31:35.090456 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.090544 kubelet[2852]: E0123 18:31:35.090482 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:35.090863 kubelet[2852]: E0123 18:31:35.090839 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:35.090863 kubelet[2852]: W0123 18:31:35.090859 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:35.090901 kubelet[2852]: E0123 18:31:35.090873 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:35.911070 kubelet[2852]: E0123 18:31:35.910989 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:31:36.005452 containerd[1684]: time="2026-01-23T18:31:36.004987127Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:36.005885 containerd[1684]: time="2026-01-23T18:31:36.005869161Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:36.007167 containerd[1684]: time="2026-01-23T18:31:36.007107501Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:36.009622 containerd[1684]: time="2026-01-23T18:31:36.009606742Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:36.010360 containerd[1684]: time="2026-01-23T18:31:36.010312293Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.897351514s" Jan 23 18:31:36.010439 containerd[1684]: time="2026-01-23T18:31:36.010365654Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Jan 23 18:31:36.013412 containerd[1684]: time="2026-01-23T18:31:36.013387023Z" level=info msg="CreateContainer within sandbox \"b4b2b0b9cbb4b49f3c36ed0e5e4e27567e87774c763ae3f943aaa71796322d49\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 23 18:31:36.017522 kubelet[2852]: I0123 18:31:36.017506 2852 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 23 18:31:36.027058 containerd[1684]: time="2026-01-23T18:31:36.026962943Z" level=info msg="Container 05a326b0e6a978abaefc24ebd6368323061f91ca65a456c76ed0d40b93d3f064: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:36.035110 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1228522413.mount: Deactivated successfully. 
Jan 23 18:31:36.040101 containerd[1684]: time="2026-01-23T18:31:36.040037414Z" level=info msg="CreateContainer within sandbox \"b4b2b0b9cbb4b49f3c36ed0e5e4e27567e87774c763ae3f943aaa71796322d49\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"05a326b0e6a978abaefc24ebd6368323061f91ca65a456c76ed0d40b93d3f064\"" Jan 23 18:31:36.041382 containerd[1684]: time="2026-01-23T18:31:36.041351956Z" level=info msg="StartContainer for \"05a326b0e6a978abaefc24ebd6368323061f91ca65a456c76ed0d40b93d3f064\"" Jan 23 18:31:36.044184 containerd[1684]: time="2026-01-23T18:31:36.044143371Z" level=info msg="connecting to shim 05a326b0e6a978abaefc24ebd6368323061f91ca65a456c76ed0d40b93d3f064" address="unix:///run/containerd/s/8a1760322a4f03d96cf542910c0595ae697dee76f40b258d936a36060bdc7ddb" protocol=ttrpc version=3 Jan 23 18:31:36.086237 systemd[1]: Started cri-containerd-05a326b0e6a978abaefc24ebd6368323061f91ca65a456c76ed0d40b93d3f064.scope - libcontainer container 05a326b0e6a978abaefc24ebd6368323061f91ca65a456c76ed0d40b93d3f064. Jan 23 18:31:36.089493 kubelet[2852]: E0123 18:31:36.089451 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.089493 kubelet[2852]: W0123 18:31:36.089483 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.091560 kubelet[2852]: E0123 18:31:36.089507 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.091560 kubelet[2852]: E0123 18:31:36.089837 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.091560 kubelet[2852]: W0123 18:31:36.089848 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.091560 kubelet[2852]: E0123 18:31:36.089860 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.091560 kubelet[2852]: E0123 18:31:36.090200 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.091560 kubelet[2852]: W0123 18:31:36.090211 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.091560 kubelet[2852]: E0123 18:31:36.090222 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:36.091560 kubelet[2852]: E0123 18:31:36.090656 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.091560 kubelet[2852]: W0123 18:31:36.090667 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.091560 kubelet[2852]: E0123 18:31:36.090679 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.092465 kubelet[2852]: E0123 18:31:36.091459 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.092465 kubelet[2852]: W0123 18:31:36.091471 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.092465 kubelet[2852]: E0123 18:31:36.091488 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.092465 kubelet[2852]: E0123 18:31:36.091899 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.092465 kubelet[2852]: W0123 18:31:36.092020 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.092465 kubelet[2852]: E0123 18:31:36.092034 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.095102 kubelet[2852]: E0123 18:31:36.094087 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.095102 kubelet[2852]: W0123 18:31:36.094106 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.095102 kubelet[2852]: E0123 18:31:36.094126 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.095500 kubelet[2852]: E0123 18:31:36.095216 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.095500 kubelet[2852]: W0123 18:31:36.095231 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.095500 kubelet[2852]: E0123 18:31:36.095247 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:36.096432 kubelet[2852]: E0123 18:31:36.096400 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.096432 kubelet[2852]: W0123 18:31:36.096422 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.096432 kubelet[2852]: E0123 18:31:36.096436 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.097712 kubelet[2852]: E0123 18:31:36.097548 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.097712 kubelet[2852]: W0123 18:31:36.097580 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.097712 kubelet[2852]: E0123 18:31:36.097627 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.098556 kubelet[2852]: E0123 18:31:36.098423 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.098556 kubelet[2852]: W0123 18:31:36.098445 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.098556 kubelet[2852]: E0123 18:31:36.098464 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.099369 kubelet[2852]: E0123 18:31:36.099296 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.099369 kubelet[2852]: W0123 18:31:36.099320 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.099369 kubelet[2852]: E0123 18:31:36.099338 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.101263 kubelet[2852]: E0123 18:31:36.101070 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.101628 kubelet[2852]: W0123 18:31:36.101377 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.101628 kubelet[2852]: E0123 18:31:36.101405 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:36.102461 kubelet[2852]: E0123 18:31:36.102439 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.102559 kubelet[2852]: W0123 18:31:36.102541 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.102652 kubelet[2852]: E0123 18:31:36.102636 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.103224 kubelet[2852]: E0123 18:31:36.103163 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.103224 kubelet[2852]: W0123 18:31:36.103184 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.103224 kubelet[2852]: E0123 18:31:36.103199 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.104064 kubelet[2852]: E0123 18:31:36.103958 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.104064 kubelet[2852]: W0123 18:31:36.104020 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.104064 kubelet[2852]: E0123 18:31:36.104036 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.104759 kubelet[2852]: E0123 18:31:36.104666 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.104759 kubelet[2852]: W0123 18:31:36.104685 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.105135 kubelet[2852]: E0123 18:31:36.104925 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.105462 kubelet[2852]: E0123 18:31:36.105444 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.105577 kubelet[2852]: W0123 18:31:36.105549 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.105738 kubelet[2852]: E0123 18:31:36.105672 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:36.106222 kubelet[2852]: E0123 18:31:36.106179 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.106222 kubelet[2852]: W0123 18:31:36.106198 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.106485 kubelet[2852]: E0123 18:31:36.106366 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.106872 kubelet[2852]: E0123 18:31:36.106833 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.106872 kubelet[2852]: W0123 18:31:36.106849 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.107108 kubelet[2852]: E0123 18:31:36.107073 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.107669 kubelet[2852]: E0123 18:31:36.107628 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.107669 kubelet[2852]: W0123 18:31:36.107647 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.108204 kubelet[2852]: E0123 18:31:36.108113 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.108699 kubelet[2852]: E0123 18:31:36.108679 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.108838 kubelet[2852]: W0123 18:31:36.108765 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.109048 kubelet[2852]: E0123 18:31:36.109015 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.109793 kubelet[2852]: E0123 18:31:36.109738 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.109793 kubelet[2852]: W0123 18:31:36.109762 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.110363 kubelet[2852]: E0123 18:31:36.110211 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:36.110588 kubelet[2852]: E0123 18:31:36.110565 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.111398 kubelet[2852]: W0123 18:31:36.111172 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.111544 kubelet[2852]: E0123 18:31:36.111512 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.111791 kubelet[2852]: E0123 18:31:36.111766 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.112222 kubelet[2852]: W0123 18:31:36.112101 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.112366 kubelet[2852]: E0123 18:31:36.112300 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.112788 kubelet[2852]: E0123 18:31:36.112745 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.112788 kubelet[2852]: W0123 18:31:36.112763 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.113056 kubelet[2852]: E0123 18:31:36.113012 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.113471 kubelet[2852]: E0123 18:31:36.113430 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.113471 kubelet[2852]: W0123 18:31:36.113448 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.113732 kubelet[2852]: E0123 18:31:36.113694 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.114232 kubelet[2852]: E0123 18:31:36.114185 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.114232 kubelet[2852]: W0123 18:31:36.114208 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.114496 kubelet[2852]: E0123 18:31:36.114397 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:36.115424 kubelet[2852]: E0123 18:31:36.115403 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.115576 kubelet[2852]: W0123 18:31:36.115500 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.115893 kubelet[2852]: E0123 18:31:36.115871 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.116462 kubelet[2852]: E0123 18:31:36.116442 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.116531 kubelet[2852]: W0123 18:31:36.116514 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.116632 kubelet[2852]: E0123 18:31:36.116615 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.117131 kubelet[2852]: E0123 18:31:36.117111 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.117219 kubelet[2852]: W0123 18:31:36.117202 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.117338 kubelet[2852]: E0123 18:31:36.117321 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.117687 kubelet[2852]: E0123 18:31:36.117667 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.117782 kubelet[2852]: W0123 18:31:36.117766 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.118145 kubelet[2852]: E0123 18:31:36.118126 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 18:31:36.118709 kubelet[2852]: E0123 18:31:36.118690 2852 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 18:31:36.118898 kubelet[2852]: W0123 18:31:36.118771 2852 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 18:31:36.118898 kubelet[2852]: E0123 18:31:36.118787 2852 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 18:31:36.176000 audit: BPF prog-id=166 op=LOAD Jan 23 18:31:36.176000 audit[3546]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3394 pid=3546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:36.176000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035613332366230653661393738616261656663323465626436333638 Jan 23 18:31:36.176000 audit: BPF prog-id=167 op=LOAD Jan 23 18:31:36.176000 audit[3546]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3394 pid=3546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:36.176000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035613332366230653661393738616261656663323465626436333638 Jan 23 18:31:36.176000 audit: BPF prog-id=167 op=UNLOAD Jan 23 18:31:36.176000 audit[3546]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3394 pid=3546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:36.176000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035613332366230653661393738616261656663323465626436333638 Jan 23 18:31:36.176000 audit: BPF prog-id=166 op=UNLOAD Jan 23 18:31:36.176000 audit[3546]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3394 pid=3546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:36.176000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035613332366230653661393738616261656663323465626436333638 Jan 23 18:31:36.176000 audit: BPF prog-id=168 op=LOAD Jan 23 18:31:36.176000 audit[3546]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3394 pid=3546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:36.176000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3035613332366230653661393738616261656663323465626436333638 Jan 23 18:31:36.215160 containerd[1684]: time="2026-01-23T18:31:36.213376384Z" level=info msg="StartContainer for 
\"05a326b0e6a978abaefc24ebd6368323061f91ca65a456c76ed0d40b93d3f064\" returns successfully" Jan 23 18:31:36.215825 systemd[1]: cri-containerd-05a326b0e6a978abaefc24ebd6368323061f91ca65a456c76ed0d40b93d3f064.scope: Deactivated successfully. Jan 23 18:31:36.219000 audit: BPF prog-id=168 op=UNLOAD Jan 23 18:31:36.220739 containerd[1684]: time="2026-01-23T18:31:36.219964350Z" level=info msg="received container exit event container_id:\"05a326b0e6a978abaefc24ebd6368323061f91ca65a456c76ed0d40b93d3f064\" id:\"05a326b0e6a978abaefc24ebd6368323061f91ca65a456c76ed0d40b93d3f064\" pid:3562 exited_at:{seconds:1769193096 nanos:219513783}" Jan 23 18:31:36.259909 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-05a326b0e6a978abaefc24ebd6368323061f91ca65a456c76ed0d40b93d3f064-rootfs.mount: Deactivated successfully. Jan 23 18:31:36.898947 sshd[3208]: banner exchange: Connection from 59.52.101.28 port 61358: invalid format Jan 23 18:31:36.900531 systemd[1]: sshd@9-46.62.169.9:22-59.52.101.28:61358.service: Deactivated successfully. Jan 23 18:31:36.900000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-46.62.169.9:22-59.52.101.28:61358 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:31:37.025579 containerd[1684]: time="2026-01-23T18:31:37.025484446Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 23 18:31:37.911584 kubelet[2852]: E0123 18:31:37.911099 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:31:39.913023 kubelet[2852]: E0123 18:31:39.911654 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:31:41.453473 kubelet[2852]: I0123 18:31:41.453429 2852 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 23 18:31:41.508018 kernel: kauditd_printk_skb: 29 callbacks suppressed Jan 23 18:31:41.508116 kernel: audit: type=1325 audit(1769193101.505:579): table=filter:117 family=2 entries=21 op=nft_register_rule pid=3635 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:41.505000 audit[3635]: NETFILTER_CFG table=filter:117 family=2 entries=21 op=nft_register_rule pid=3635 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:41.515492 kernel: audit: type=1300 audit(1769193101.505:579): arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7fffac4994b0 a2=0 a3=7fffac49949c items=0 ppid=2960 pid=3635 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:41.505000 audit[3635]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7fffac4994b0 a2=0 a3=7fffac49949c items=0 ppid=2960 pid=3635 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 18:31:41.505000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:41.523026 kernel: audit: type=1327 audit(1769193101.505:579): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:41.515000 audit[3635]: NETFILTER_CFG table=nat:118 family=2 entries=19 op=nft_register_chain pid=3635 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:41.528053 kernel: audit: type=1325 audit(1769193101.515:580): table=nat:118 family=2 entries=19 op=nft_register_chain pid=3635 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:41.515000 audit[3635]: SYSCALL arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7fffac4994b0 a2=0 a3=7fffac49949c items=0 ppid=2960 pid=3635 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:41.533789 kernel: audit: type=1300 audit(1769193101.515:580): arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7fffac4994b0 a2=0 a3=7fffac49949c items=0 ppid=2960 pid=3635 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:41.515000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:41.544996 kernel: audit: type=1327 audit(1769193101.515:580): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:41.663237 containerd[1684]: time="2026-01-23T18:31:41.663182441Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:41.664327 containerd[1684]: time="2026-01-23T18:31:41.664209484Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70442291" Jan 23 18:31:41.665156 containerd[1684]: time="2026-01-23T18:31:41.665133675Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:41.666682 containerd[1684]: time="2026-01-23T18:31:41.666657113Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:41.667136 containerd[1684]: time="2026-01-23T18:31:41.667105118Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 4.641566871s" Jan 23 18:31:41.667186 containerd[1684]: time="2026-01-23T18:31:41.667176189Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Jan 23 18:31:41.669415 containerd[1684]: time="2026-01-23T18:31:41.669382055Z" level=info 
msg="CreateContainer within sandbox \"b4b2b0b9cbb4b49f3c36ed0e5e4e27567e87774c763ae3f943aaa71796322d49\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 23 18:31:41.681380 containerd[1684]: time="2026-01-23T18:31:41.681252366Z" level=info msg="Container cdef3f3439b9a2f5c1fb89205627aacd576d26a183afce5ff1465c88c82cac63: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:41.691987 containerd[1684]: time="2026-01-23T18:31:41.690445805Z" level=info msg="CreateContainer within sandbox \"b4b2b0b9cbb4b49f3c36ed0e5e4e27567e87774c763ae3f943aaa71796322d49\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"cdef3f3439b9a2f5c1fb89205627aacd576d26a183afce5ff1465c88c82cac63\"" Jan 23 18:31:41.693160 containerd[1684]: time="2026-01-23T18:31:41.693118556Z" level=info msg="StartContainer for \"cdef3f3439b9a2f5c1fb89205627aacd576d26a183afce5ff1465c88c82cac63\"" Jan 23 18:31:41.694916 containerd[1684]: time="2026-01-23T18:31:41.694897437Z" level=info msg="connecting to shim cdef3f3439b9a2f5c1fb89205627aacd576d26a183afce5ff1465c88c82cac63" address="unix:///run/containerd/s/8a1760322a4f03d96cf542910c0595ae697dee76f40b258d936a36060bdc7ddb" protocol=ttrpc version=3 Jan 23 18:31:41.713107 systemd[1]: Started cri-containerd-cdef3f3439b9a2f5c1fb89205627aacd576d26a183afce5ff1465c88c82cac63.scope - libcontainer container cdef3f3439b9a2f5c1fb89205627aacd576d26a183afce5ff1465c88c82cac63. Jan 23 18:31:41.755000 audit: BPF prog-id=169 op=LOAD Jan 23 18:31:41.755000 audit[3640]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=3394 pid=3640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:41.761334 kernel: audit: type=1334 audit(1769193101.755:581): prog-id=169 op=LOAD Jan 23 18:31:41.761421 kernel: audit: type=1300 audit(1769193101.755:581): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=3394 pid=3640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:41.755000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364656633663334333962396132663563316662383932303536323761 Jan 23 18:31:41.768531 kernel: audit: type=1327 audit(1769193101.755:581): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364656633663334333962396132663563316662383932303536323761 Jan 23 18:31:41.755000 audit: BPF prog-id=170 op=LOAD Jan 23 18:31:41.755000 audit[3640]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=3394 pid=3640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:41.755000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364656633663334333962396132663563316662383932303536323761 Jan 23 18:31:41.755000 audit: BPF prog-id=170 op=UNLOAD Jan 23 18:31:41.755000 audit[3640]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3394 pid=3640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:41.755000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364656633663334333962396132663563316662383932303536323761 Jan 23 18:31:41.755000 audit: BPF prog-id=169 op=UNLOAD Jan 23 18:31:41.755000 audit[3640]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3394 pid=3640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:41.778502 kernel: audit: type=1334 audit(1769193101.755:582): prog-id=170 op=LOAD Jan 23 18:31:41.755000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364656633663334333962396132663563316662383932303536323761 Jan 23 18:31:41.755000 audit: BPF prog-id=171 op=LOAD Jan 23 18:31:41.755000 audit[3640]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=3394 pid=3640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:41.755000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6364656633663334333962396132663563316662383932303536323761 Jan 23 18:31:41.804766 containerd[1684]: time="2026-01-23T18:31:41.804695218Z" level=info msg="StartContainer for \"cdef3f3439b9a2f5c1fb89205627aacd576d26a183afce5ff1465c88c82cac63\" returns successfully" Jan 23 18:31:41.909826 kubelet[2852]: E0123 18:31:41.909763 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:31:42.421170 systemd[1]: cri-containerd-cdef3f3439b9a2f5c1fb89205627aacd576d26a183afce5ff1465c88c82cac63.scope: Deactivated successfully. Jan 23 18:31:42.421997 systemd[1]: cri-containerd-cdef3f3439b9a2f5c1fb89205627aacd576d26a183afce5ff1465c88c82cac63.scope: Consumed 613ms CPU time, 194.4M memory peak, 171.3M written to disk. 
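The repeated kubelet errors earlier in this stretch come from probing a FlexVolume driver whose executable is missing: the driver call produces empty stdout, so the JSON unmarshal in driver-call.go fails with "unexpected end of JSON input". As a rough illustration only (this is not the actual nodeagent~uds binary), a FlexVolume driver answers the `init` call by printing a JSON status object:

```go
// Hypothetical sketch of the FlexVolume "init" handshake the kubelet expects from a
// driver under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/<vendor~driver>/.
// It only illustrates why an absent binary (empty output) yields the JSON error above.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// DriverStatus mirrors the shape of the status object a FlexVolume driver prints to stdout.
type DriverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) > 1 && os.Args[1] == "init" {
		// The kubelet unmarshals whatever appears on stdout; printing nothing
		// (or the binary not existing, as in the log) makes that unmarshal fail.
		out, _ := json.Marshal(DriverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
		fmt.Println(string(out))
		return
	}
	// Other calls (mount/unmount) are reported as unsupported in this sketch.
	out, _ := json.Marshal(DriverStatus{Status: "Not supported"})
	fmt.Println(string(out))
	os.Exit(1)
}
```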
Jan 23 18:31:42.422913 containerd[1684]: time="2026-01-23T18:31:42.422844293Z" level=info msg="received container exit event container_id:\"cdef3f3439b9a2f5c1fb89205627aacd576d26a183afce5ff1465c88c82cac63\" id:\"cdef3f3439b9a2f5c1fb89205627aacd576d26a183afce5ff1465c88c82cac63\" pid:3653 exited_at:{seconds:1769193102 nanos:422472359}" Jan 23 18:31:42.425000 audit: BPF prog-id=171 op=UNLOAD Jan 23 18:31:42.452356 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cdef3f3439b9a2f5c1fb89205627aacd576d26a183afce5ff1465c88c82cac63-rootfs.mount: Deactivated successfully. Jan 23 18:31:42.512536 kubelet[2852]: I0123 18:31:42.511965 2852 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 23 18:31:42.561339 systemd[1]: Created slice kubepods-besteffort-podc3e274ab_7780_47cb_bdba_8ff5c1acc9ce.slice - libcontainer container kubepods-besteffort-podc3e274ab_7780_47cb_bdba_8ff5c1acc9ce.slice. Jan 23 18:31:42.586578 systemd[1]: Created slice kubepods-besteffort-pod8bfceda8_a381_44e3_ab5c_8d6cd2f190e0.slice - libcontainer container kubepods-besteffort-pod8bfceda8_a381_44e3_ab5c_8d6cd2f190e0.slice. Jan 23 18:31:42.594681 systemd[1]: Created slice kubepods-burstable-pod43b5afc6_15bc_4463_a623_3577c228380b.slice - libcontainer container kubepods-burstable-pod43b5afc6_15bc_4463_a623_3577c228380b.slice. Jan 23 18:31:42.601383 systemd[1]: Created slice kubepods-besteffort-pod4e218aa3_82e2_43da_97a5_0f48de07a97f.slice - libcontainer container kubepods-besteffort-pod4e218aa3_82e2_43da_97a5_0f48de07a97f.slice. Jan 23 18:31:42.609764 systemd[1]: Created slice kubepods-burstable-pod87ab8ef8_50a9_48bb_8cd1_1b2e13486180.slice - libcontainer container kubepods-burstable-pod87ab8ef8_50a9_48bb_8cd1_1b2e13486180.slice. Jan 23 18:31:42.615088 systemd[1]: Created slice kubepods-besteffort-pod8cfde420_1102_4e10_b36c_f5766b852cc7.slice - libcontainer container kubepods-besteffort-pod8cfde420_1102_4e10_b36c_f5766b852cc7.slice. Jan 23 18:31:42.621923 systemd[1]: Created slice kubepods-besteffort-pod7830dd1a_252c_46ff_ba84_ba7feca691a8.slice - libcontainer container kubepods-besteffort-pod7830dd1a_252c_46ff_ba84_ba7feca691a8.slice. 
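The PROCTITLE fields in the runc audit records above are the process command line, hex-encoded with NUL-separated arguments (the kernel truncates long values, which is why the container ID at the end appears cut short). A small sketch of decoding one such value, using only a prefix of the logged string:

```go
// Sketch: decode an audit PROCTITLE value (hex-encoded argv, NUL-separated)
// such as the runc entries logged above into a readable command line.
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

func main() {
	// Prefix of one PROCTITLE value from the log (the full value is longer).
	proctitle := "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67"
	raw, err := hex.DecodeString(proctitle)
	if err != nil {
		panic(err)
	}
	// argv elements are separated by NUL bytes in the record.
	fmt.Println(strings.Join(strings.Split(string(raw), "\x00"), " "))
	// Output: runc --root /run/containerd/runc/k8s.io --log
}
```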
Jan 23 18:31:42.655998 kubelet[2852]: I0123 18:31:42.655773 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvxc8\" (UniqueName: \"kubernetes.io/projected/c3e274ab-7780-47cb-bdba-8ff5c1acc9ce-kube-api-access-rvxc8\") pod \"whisker-bcb54d7bd-b6q48\" (UID: \"c3e274ab-7780-47cb-bdba-8ff5c1acc9ce\") " pod="calico-system/whisker-bcb54d7bd-b6q48" Jan 23 18:31:42.655998 kubelet[2852]: I0123 18:31:42.655806 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8bfceda8-a381-44e3-ab5c-8d6cd2f190e0-calico-apiserver-certs\") pod \"calico-apiserver-548cd799bc-rm88n\" (UID: \"8bfceda8-a381-44e3-ab5c-8d6cd2f190e0\") " pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" Jan 23 18:31:42.655998 kubelet[2852]: I0123 18:31:42.655819 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz6z2\" (UniqueName: \"kubernetes.io/projected/87ab8ef8-50a9-48bb-8cd1-1b2e13486180-kube-api-access-hz6z2\") pod \"coredns-668d6bf9bc-nqrcz\" (UID: \"87ab8ef8-50a9-48bb-8cd1-1b2e13486180\") " pod="kube-system/coredns-668d6bf9bc-nqrcz" Jan 23 18:31:42.655998 kubelet[2852]: I0123 18:31:42.655832 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cfde420-1102-4e10-b36c-f5766b852cc7-config\") pod \"goldmane-666569f655-dw7bl\" (UID: \"8cfde420-1102-4e10-b36c-f5766b852cc7\") " pod="calico-system/goldmane-666569f655-dw7bl" Jan 23 18:31:42.655998 kubelet[2852]: I0123 18:31:42.655846 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3e274ab-7780-47cb-bdba-8ff5c1acc9ce-whisker-ca-bundle\") pod \"whisker-bcb54d7bd-b6q48\" (UID: \"c3e274ab-7780-47cb-bdba-8ff5c1acc9ce\") " pod="calico-system/whisker-bcb54d7bd-b6q48" Jan 23 18:31:42.656185 kubelet[2852]: I0123 18:31:42.655857 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87ab8ef8-50a9-48bb-8cd1-1b2e13486180-config-volume\") pod \"coredns-668d6bf9bc-nqrcz\" (UID: \"87ab8ef8-50a9-48bb-8cd1-1b2e13486180\") " pod="kube-system/coredns-668d6bf9bc-nqrcz" Jan 23 18:31:42.656185 kubelet[2852]: I0123 18:31:42.655869 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43b5afc6-15bc-4463-a623-3577c228380b-config-volume\") pod \"coredns-668d6bf9bc-rtp4c\" (UID: \"43b5afc6-15bc-4463-a623-3577c228380b\") " pod="kube-system/coredns-668d6bf9bc-rtp4c" Jan 23 18:31:42.656185 kubelet[2852]: I0123 18:31:42.655882 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c3e274ab-7780-47cb-bdba-8ff5c1acc9ce-whisker-backend-key-pair\") pod \"whisker-bcb54d7bd-b6q48\" (UID: \"c3e274ab-7780-47cb-bdba-8ff5c1acc9ce\") " pod="calico-system/whisker-bcb54d7bd-b6q48" Jan 23 18:31:42.656185 kubelet[2852]: I0123 18:31:42.655893 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2whq\" (UniqueName: \"kubernetes.io/projected/8bfceda8-a381-44e3-ab5c-8d6cd2f190e0-kube-api-access-r2whq\") pod 
\"calico-apiserver-548cd799bc-rm88n\" (UID: \"8bfceda8-a381-44e3-ab5c-8d6cd2f190e0\") " pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" Jan 23 18:31:42.656185 kubelet[2852]: I0123 18:31:42.655904 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwzkd\" (UniqueName: \"kubernetes.io/projected/4e218aa3-82e2-43da-97a5-0f48de07a97f-kube-api-access-kwzkd\") pod \"calico-kube-controllers-69d9c854f8-4vpsg\" (UID: \"4e218aa3-82e2-43da-97a5-0f48de07a97f\") " pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" Jan 23 18:31:42.656272 kubelet[2852]: I0123 18:31:42.655917 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4nkk\" (UniqueName: \"kubernetes.io/projected/7830dd1a-252c-46ff-ba84-ba7feca691a8-kube-api-access-p4nkk\") pod \"calico-apiserver-548cd799bc-4tqh9\" (UID: \"7830dd1a-252c-46ff-ba84-ba7feca691a8\") " pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" Jan 23 18:31:42.656272 kubelet[2852]: I0123 18:31:42.655929 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8vjj\" (UniqueName: \"kubernetes.io/projected/43b5afc6-15bc-4463-a623-3577c228380b-kube-api-access-v8vjj\") pod \"coredns-668d6bf9bc-rtp4c\" (UID: \"43b5afc6-15bc-4463-a623-3577c228380b\") " pod="kube-system/coredns-668d6bf9bc-rtp4c" Jan 23 18:31:42.656272 kubelet[2852]: I0123 18:31:42.655941 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cfde420-1102-4e10-b36c-f5766b852cc7-goldmane-ca-bundle\") pod \"goldmane-666569f655-dw7bl\" (UID: \"8cfde420-1102-4e10-b36c-f5766b852cc7\") " pod="calico-system/goldmane-666569f655-dw7bl" Jan 23 18:31:42.656272 kubelet[2852]: I0123 18:31:42.655952 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7830dd1a-252c-46ff-ba84-ba7feca691a8-calico-apiserver-certs\") pod \"calico-apiserver-548cd799bc-4tqh9\" (UID: \"7830dd1a-252c-46ff-ba84-ba7feca691a8\") " pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" Jan 23 18:31:42.656464 kubelet[2852]: I0123 18:31:42.655967 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x97s9\" (UniqueName: \"kubernetes.io/projected/8cfde420-1102-4e10-b36c-f5766b852cc7-kube-api-access-x97s9\") pod \"goldmane-666569f655-dw7bl\" (UID: \"8cfde420-1102-4e10-b36c-f5766b852cc7\") " pod="calico-system/goldmane-666569f655-dw7bl" Jan 23 18:31:42.656464 kubelet[2852]: I0123 18:31:42.656390 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/8cfde420-1102-4e10-b36c-f5766b852cc7-goldmane-key-pair\") pod \"goldmane-666569f655-dw7bl\" (UID: \"8cfde420-1102-4e10-b36c-f5766b852cc7\") " pod="calico-system/goldmane-666569f655-dw7bl" Jan 23 18:31:42.656464 kubelet[2852]: I0123 18:31:42.656408 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e218aa3-82e2-43da-97a5-0f48de07a97f-tigera-ca-bundle\") pod \"calico-kube-controllers-69d9c854f8-4vpsg\" (UID: \"4e218aa3-82e2-43da-97a5-0f48de07a97f\") " pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" Jan 23 
18:31:42.869457 containerd[1684]: time="2026-01-23T18:31:42.869390786Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-bcb54d7bd-b6q48,Uid:c3e274ab-7780-47cb-bdba-8ff5c1acc9ce,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:42.894050 containerd[1684]: time="2026-01-23T18:31:42.893936689Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-548cd799bc-rm88n,Uid:8bfceda8-a381-44e3-ab5c-8d6cd2f190e0,Namespace:calico-apiserver,Attempt:0,}" Jan 23 18:31:42.901017 containerd[1684]: time="2026-01-23T18:31:42.899957356Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-rtp4c,Uid:43b5afc6-15bc-4463-a623-3577c228380b,Namespace:kube-system,Attempt:0,}" Jan 23 18:31:42.908115 containerd[1684]: time="2026-01-23T18:31:42.908041676Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69d9c854f8-4vpsg,Uid:4e218aa3-82e2-43da-97a5-0f48de07a97f,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:42.918501 containerd[1684]: time="2026-01-23T18:31:42.918405122Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nqrcz,Uid:87ab8ef8-50a9-48bb-8cd1-1b2e13486180,Namespace:kube-system,Attempt:0,}" Jan 23 18:31:42.923706 containerd[1684]: time="2026-01-23T18:31:42.923659180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-dw7bl,Uid:8cfde420-1102-4e10-b36c-f5766b852cc7,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:42.928013 containerd[1684]: time="2026-01-23T18:31:42.927902227Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-548cd799bc-4tqh9,Uid:7830dd1a-252c-46ff-ba84-ba7feca691a8,Namespace:calico-apiserver,Attempt:0,}" Jan 23 18:31:43.030595 containerd[1684]: time="2026-01-23T18:31:43.030543831Z" level=error msg="Failed to destroy network for sandbox \"6b79400878c073f62042e1357e40af6289b412ee75da867a2bf1788d83595c0d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.032431 containerd[1684]: time="2026-01-23T18:31:43.032356180Z" level=error msg="Failed to destroy network for sandbox \"52e59ca449edf0bcc17b968286b6e642d04e176e879dcd79d6b14b6339858879\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.033688 containerd[1684]: time="2026-01-23T18:31:43.033656293Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-dw7bl,Uid:8cfde420-1102-4e10-b36c-f5766b852cc7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b79400878c073f62042e1357e40af6289b412ee75da867a2bf1788d83595c0d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.034555 kubelet[2852]: E0123 18:31:43.034498 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b79400878c073f62042e1357e40af6289b412ee75da867a2bf1788d83595c0d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.034660 
kubelet[2852]: E0123 18:31:43.034584 2852 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b79400878c073f62042e1357e40af6289b412ee75da867a2bf1788d83595c0d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-dw7bl" Jan 23 18:31:43.034660 kubelet[2852]: E0123 18:31:43.034606 2852 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b79400878c073f62042e1357e40af6289b412ee75da867a2bf1788d83595c0d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-dw7bl" Jan 23 18:31:43.034660 kubelet[2852]: E0123 18:31:43.034648 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-dw7bl_calico-system(8cfde420-1102-4e10-b36c-f5766b852cc7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-dw7bl_calico-system(8cfde420-1102-4e10-b36c-f5766b852cc7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6b79400878c073f62042e1357e40af6289b412ee75da867a2bf1788d83595c0d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-dw7bl" podUID="8cfde420-1102-4e10-b36c-f5766b852cc7" Jan 23 18:31:43.036465 containerd[1684]: time="2026-01-23T18:31:43.036443162Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69d9c854f8-4vpsg,Uid:4e218aa3-82e2-43da-97a5-0f48de07a97f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"52e59ca449edf0bcc17b968286b6e642d04e176e879dcd79d6b14b6339858879\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.036987 kubelet[2852]: E0123 18:31:43.036939 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52e59ca449edf0bcc17b968286b6e642d04e176e879dcd79d6b14b6339858879\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.037027 kubelet[2852]: E0123 18:31:43.036985 2852 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52e59ca449edf0bcc17b968286b6e642d04e176e879dcd79d6b14b6339858879\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" Jan 23 18:31:43.037027 kubelet[2852]: E0123 18:31:43.037005 2852 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"52e59ca449edf0bcc17b968286b6e642d04e176e879dcd79d6b14b6339858879\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" Jan 23 18:31:43.037065 kubelet[2852]: E0123 18:31:43.037033 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-69d9c854f8-4vpsg_calico-system(4e218aa3-82e2-43da-97a5-0f48de07a97f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-69d9c854f8-4vpsg_calico-system(4e218aa3-82e2-43da-97a5-0f48de07a97f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"52e59ca449edf0bcc17b968286b6e642d04e176e879dcd79d6b14b6339858879\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" podUID="4e218aa3-82e2-43da-97a5-0f48de07a97f" Jan 23 18:31:43.056435 containerd[1684]: time="2026-01-23T18:31:43.056396292Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 23 18:31:43.084720 containerd[1684]: time="2026-01-23T18:31:43.084594906Z" level=error msg="Failed to destroy network for sandbox \"028f6b6172a825e4999305aae263746e762d77be24491deee1f10cdf6bc7bc81\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.088596 containerd[1684]: time="2026-01-23T18:31:43.088566278Z" level=error msg="Failed to destroy network for sandbox \"01fbd4e67749b00a548fca56b78b15d22b01a08d7de8864d003ffac82372adaf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.088693 containerd[1684]: time="2026-01-23T18:31:43.088585448Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-bcb54d7bd-b6q48,Uid:c3e274ab-7780-47cb-bdba-8ff5c1acc9ce,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"028f6b6172a825e4999305aae263746e762d77be24491deee1f10cdf6bc7bc81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.088902 kubelet[2852]: E0123 18:31:43.088871 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"028f6b6172a825e4999305aae263746e762d77be24491deee1f10cdf6bc7bc81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.089142 kubelet[2852]: E0123 18:31:43.089127 2852 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"028f6b6172a825e4999305aae263746e762d77be24491deee1f10cdf6bc7bc81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/whisker-bcb54d7bd-b6q48" Jan 23 18:31:43.089199 kubelet[2852]: E0123 18:31:43.089189 2852 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"028f6b6172a825e4999305aae263746e762d77be24491deee1f10cdf6bc7bc81\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-bcb54d7bd-b6q48" Jan 23 18:31:43.089279 kubelet[2852]: E0123 18:31:43.089262 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-bcb54d7bd-b6q48_calico-system(c3e274ab-7780-47cb-bdba-8ff5c1acc9ce)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-bcb54d7bd-b6q48_calico-system(c3e274ab-7780-47cb-bdba-8ff5c1acc9ce)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"028f6b6172a825e4999305aae263746e762d77be24491deee1f10cdf6bc7bc81\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-bcb54d7bd-b6q48" podUID="c3e274ab-7780-47cb-bdba-8ff5c1acc9ce" Jan 23 18:31:43.094531 containerd[1684]: time="2026-01-23T18:31:43.094506050Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-rtp4c,Uid:43b5afc6-15bc-4463-a623-3577c228380b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"01fbd4e67749b00a548fca56b78b15d22b01a08d7de8864d003ffac82372adaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.094796 kubelet[2852]: E0123 18:31:43.094750 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01fbd4e67749b00a548fca56b78b15d22b01a08d7de8864d003ffac82372adaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.094834 kubelet[2852]: E0123 18:31:43.094806 2852 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01fbd4e67749b00a548fca56b78b15d22b01a08d7de8864d003ffac82372adaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-rtp4c" Jan 23 18:31:43.094834 kubelet[2852]: E0123 18:31:43.094824 2852 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01fbd4e67749b00a548fca56b78b15d22b01a08d7de8864d003ffac82372adaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-rtp4c" Jan 23 18:31:43.094874 kubelet[2852]: E0123 18:31:43.094857 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"coredns-668d6bf9bc-rtp4c_kube-system(43b5afc6-15bc-4463-a623-3577c228380b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-rtp4c_kube-system(43b5afc6-15bc-4463-a623-3577c228380b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"01fbd4e67749b00a548fca56b78b15d22b01a08d7de8864d003ffac82372adaf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-rtp4c" podUID="43b5afc6-15bc-4463-a623-3577c228380b" Jan 23 18:31:43.097782 containerd[1684]: time="2026-01-23T18:31:43.097764324Z" level=error msg="Failed to destroy network for sandbox \"c02ec13efbc4faa1c78763d2d2a1f9195e4e87641fa1b25372e5cf1396b125b4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.099597 containerd[1684]: time="2026-01-23T18:31:43.099576364Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-548cd799bc-rm88n,Uid:8bfceda8-a381-44e3-ab5c-8d6cd2f190e0,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c02ec13efbc4faa1c78763d2d2a1f9195e4e87641fa1b25372e5cf1396b125b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.099780 kubelet[2852]: E0123 18:31:43.099763 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c02ec13efbc4faa1c78763d2d2a1f9195e4e87641fa1b25372e5cf1396b125b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.099867 kubelet[2852]: E0123 18:31:43.099856 2852 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c02ec13efbc4faa1c78763d2d2a1f9195e4e87641fa1b25372e5cf1396b125b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" Jan 23 18:31:43.099910 kubelet[2852]: E0123 18:31:43.099901 2852 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c02ec13efbc4faa1c78763d2d2a1f9195e4e87641fa1b25372e5cf1396b125b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" Jan 23 18:31:43.100260 kubelet[2852]: E0123 18:31:43.099966 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-548cd799bc-rm88n_calico-apiserver(8bfceda8-a381-44e3-ab5c-8d6cd2f190e0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-548cd799bc-rm88n_calico-apiserver(8bfceda8-a381-44e3-ab5c-8d6cd2f190e0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"c02ec13efbc4faa1c78763d2d2a1f9195e4e87641fa1b25372e5cf1396b125b4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" podUID="8bfceda8-a381-44e3-ab5c-8d6cd2f190e0" Jan 23 18:31:43.109596 containerd[1684]: time="2026-01-23T18:31:43.109572838Z" level=error msg="Failed to destroy network for sandbox \"257411cdf6d4cfc1542a5082663c54687aafbd0486eb6dd4b1921ee8ba878771\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.109981 containerd[1684]: time="2026-01-23T18:31:43.109934122Z" level=error msg="Failed to destroy network for sandbox \"5be76e759578d21f069b924a42688716e683a97bc56a54efdcecf35fe87f165e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.111703 containerd[1684]: time="2026-01-23T18:31:43.111650700Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nqrcz,Uid:87ab8ef8-50a9-48bb-8cd1-1b2e13486180,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"257411cdf6d4cfc1542a5082663c54687aafbd0486eb6dd4b1921ee8ba878771\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.111968 kubelet[2852]: E0123 18:31:43.111884 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"257411cdf6d4cfc1542a5082663c54687aafbd0486eb6dd4b1921ee8ba878771\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.111968 kubelet[2852]: E0123 18:31:43.111930 2852 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"257411cdf6d4cfc1542a5082663c54687aafbd0486eb6dd4b1921ee8ba878771\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-nqrcz" Jan 23 18:31:43.111968 kubelet[2852]: E0123 18:31:43.111943 2852 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"257411cdf6d4cfc1542a5082663c54687aafbd0486eb6dd4b1921ee8ba878771\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-nqrcz" Jan 23 18:31:43.112487 kubelet[2852]: E0123 18:31:43.112072 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-nqrcz_kube-system(87ab8ef8-50a9-48bb-8cd1-1b2e13486180)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-nqrcz_kube-system(87ab8ef8-50a9-48bb-8cd1-1b2e13486180)\\\": rpc error: code = Unknown 
desc = failed to setup network for sandbox \\\"257411cdf6d4cfc1542a5082663c54687aafbd0486eb6dd4b1921ee8ba878771\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-nqrcz" podUID="87ab8ef8-50a9-48bb-8cd1-1b2e13486180" Jan 23 18:31:43.114050 containerd[1684]: time="2026-01-23T18:31:43.113637890Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-548cd799bc-4tqh9,Uid:7830dd1a-252c-46ff-ba84-ba7feca691a8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5be76e759578d21f069b924a42688716e683a97bc56a54efdcecf35fe87f165e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.114120 kubelet[2852]: E0123 18:31:43.113778 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5be76e759578d21f069b924a42688716e683a97bc56a54efdcecf35fe87f165e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:43.114120 kubelet[2852]: E0123 18:31:43.113841 2852 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5be76e759578d21f069b924a42688716e683a97bc56a54efdcecf35fe87f165e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" Jan 23 18:31:43.114120 kubelet[2852]: E0123 18:31:43.113852 2852 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5be76e759578d21f069b924a42688716e683a97bc56a54efdcecf35fe87f165e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" Jan 23 18:31:43.114183 kubelet[2852]: E0123 18:31:43.113875 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-548cd799bc-4tqh9_calico-apiserver(7830dd1a-252c-46ff-ba84-ba7feca691a8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-548cd799bc-4tqh9_calico-apiserver(7830dd1a-252c-46ff-ba84-ba7feca691a8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5be76e759578d21f069b924a42688716e683a97bc56a54efdcecf35fe87f165e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" podUID="7830dd1a-252c-46ff-ba84-ba7feca691a8" Jan 23 18:31:43.922578 systemd[1]: Created slice kubepods-besteffort-podf61eaf28_593a_461f_8945_a34eecb93534.slice - libcontainer container kubepods-besteffort-podf61eaf28_593a_461f_8945_a34eecb93534.slice. 
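The kubepods slice names systemd reports in this stretch are derived mechanically from the pod UID: under the systemd cgroup driver the kubelet replaces dashes in the UID with underscores and nests the pod under its QoS class. A quick sketch reproducing the leaf slice name for the csi-node-driver pod created just above:

```go
// Sketch: how the kubepods slice names in the log map back to pod UIDs under
// the systemd cgroup driver (dashes in the UID are replaced with underscores).
package main

import (
	"fmt"
	"strings"
)

func sliceName(qos, podUID string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qos, strings.ReplaceAll(podUID, "-", "_"))
}

func main() {
	// UID taken from the csi-node-driver-6z82s entries above.
	fmt.Println(sliceName("besteffort", "f61eaf28-593a-461f-8945-a34eecb93534"))
	// Output: kubepods-besteffort-podf61eaf28_593a_461f_8945_a34eecb93534.slice
}
```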
Jan 23 18:31:43.927011 containerd[1684]: time="2026-01-23T18:31:43.926917242Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6z82s,Uid:f61eaf28-593a-461f-8945-a34eecb93534,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:44.025265 containerd[1684]: time="2026-01-23T18:31:44.025163116Z" level=error msg="Failed to destroy network for sandbox \"6c489f2c40286b85e23d80b5d52d8282ecbfcfbbabc690addc657782603966cd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:44.029935 systemd[1]: run-netns-cni\x2d1de085c8\x2d5c5f\x2d97cc\x2d316d\x2d75c673c11d1f.mount: Deactivated successfully. Jan 23 18:31:44.032037 containerd[1684]: time="2026-01-23T18:31:44.031605919Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6z82s,Uid:f61eaf28-593a-461f-8945-a34eecb93534,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c489f2c40286b85e23d80b5d52d8282ecbfcfbbabc690addc657782603966cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:44.032292 kubelet[2852]: E0123 18:31:44.032223 2852 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c489f2c40286b85e23d80b5d52d8282ecbfcfbbabc690addc657782603966cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 18:31:44.033138 kubelet[2852]: E0123 18:31:44.032836 2852 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c489f2c40286b85e23d80b5d52d8282ecbfcfbbabc690addc657782603966cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-6z82s" Jan 23 18:31:44.033197 kubelet[2852]: E0123 18:31:44.033134 2852 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c489f2c40286b85e23d80b5d52d8282ecbfcfbbabc690addc657782603966cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-6z82s" Jan 23 18:31:44.034969 kubelet[2852]: E0123 18:31:44.033695 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-6z82s_calico-system(f61eaf28-593a-461f-8945-a34eecb93534)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-6z82s_calico-system(f61eaf28-593a-461f-8945-a34eecb93534)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6c489f2c40286b85e23d80b5d52d8282ecbfcfbbabc690addc657782603966cd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" 
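Every sandbox failure above reports the same root cause: the Calico CNI plugin stats /var/lib/calico/nodename before adding or deleting a network, and that file only exists once the calico-node container is running with /var/lib/calico mounted. An illustrative version of that precondition check (not the plugin's actual code):

```go
// Illustrative sketch of the precondition behind the repeated sandbox failures:
// the Calico CNI plugin reads /var/lib/calico/nodename, which calico-node only
// writes once it is up and has /var/lib/calico mounted.
package main

import (
	"fmt"
	"os"
	"strings"
)

func main() {
	data, err := os.ReadFile("/var/lib/calico/nodename")
	if err != nil {
		// This is the state the node is in until calico-node starts.
		fmt.Println("CNI not ready:", err)
		return
	}
	fmt.Println("Calico node name:", strings.TrimSpace(string(data)))
}
```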
Jan 23 18:31:51.478435 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3908041061.mount: Deactivated successfully. Jan 23 18:31:51.499833 containerd[1684]: time="2026-01-23T18:31:51.499771101Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:51.501273 containerd[1684]: time="2026-01-23T18:31:51.501245801Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Jan 23 18:31:51.502673 containerd[1684]: time="2026-01-23T18:31:51.502625739Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:51.505004 containerd[1684]: time="2026-01-23T18:31:51.504958384Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 18:31:51.505769 containerd[1684]: time="2026-01-23T18:31:51.505602378Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 8.449168936s" Jan 23 18:31:51.505769 containerd[1684]: time="2026-01-23T18:31:51.505639358Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Jan 23 18:31:51.520354 containerd[1684]: time="2026-01-23T18:31:51.520329213Z" level=info msg="CreateContainer within sandbox \"b4b2b0b9cbb4b49f3c36ed0e5e4e27567e87774c763ae3f943aaa71796322d49\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 23 18:31:51.540145 containerd[1684]: time="2026-01-23T18:31:51.540105340Z" level=info msg="Container 6bea94c98e20ae9b43dabd9fd51bd630227ea35c37f46e2298d4ae1be89c9a0e: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:51.541595 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount479920748.mount: Deactivated successfully. Jan 23 18:31:51.552980 containerd[1684]: time="2026-01-23T18:31:51.552926952Z" level=info msg="CreateContainer within sandbox \"b4b2b0b9cbb4b49f3c36ed0e5e4e27567e87774c763ae3f943aaa71796322d49\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"6bea94c98e20ae9b43dabd9fd51bd630227ea35c37f46e2298d4ae1be89c9a0e\"" Jan 23 18:31:51.553694 containerd[1684]: time="2026-01-23T18:31:51.553388205Z" level=info msg="StartContainer for \"6bea94c98e20ae9b43dabd9fd51bd630227ea35c37f46e2298d4ae1be89c9a0e\"" Jan 23 18:31:51.554501 containerd[1684]: time="2026-01-23T18:31:51.554486122Z" level=info msg="connecting to shim 6bea94c98e20ae9b43dabd9fd51bd630227ea35c37f46e2298d4ae1be89c9a0e" address="unix:///run/containerd/s/8a1760322a4f03d96cf542910c0595ae697dee76f40b258d936a36060bdc7ddb" protocol=ttrpc version=3 Jan 23 18:31:51.597106 systemd[1]: Started cri-containerd-6bea94c98e20ae9b43dabd9fd51bd630227ea35c37f46e2298d4ae1be89c9a0e.scope - libcontainer container 6bea94c98e20ae9b43dabd9fd51bd630227ea35c37f46e2298d4ae1be89c9a0e. 
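The pull recorded just above (about 157 MB for ghcr.io/flatcar/calico/node:v3.30.4, fetched in roughly 8.4 s) and the following CreateContainer/StartContainer calls are driven by the kubelet through containerd's CRI service. Purely as a hedged approximation, a comparable pull via containerd's Go client (github.com/containerd/containerd) in the k8s.io namespace would look like:

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Kubernetes-managed images live in containerd's "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/node:v3.30.4", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("pulled", img.Name(), "digest", img.Target().Digest)
}

The real path goes through the CRI API rather than this client, so treat the snippet only as an illustration of what the "PullImage ... returns image reference" message corresponds to.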
Jan 23 18:31:51.663000 audit: BPF prog-id=172 op=LOAD Jan 23 18:31:51.665605 kernel: kauditd_printk_skb: 12 callbacks suppressed Jan 23 18:31:51.665675 kernel: audit: type=1334 audit(1769193111.663:587): prog-id=172 op=LOAD Jan 23 18:31:51.663000 audit[3914]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001fa488 a2=98 a3=0 items=0 ppid=3394 pid=3914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:51.680049 kernel: audit: type=1300 audit(1769193111.663:587): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001fa488 a2=98 a3=0 items=0 ppid=3394 pid=3914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:51.663000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3662656139346339386532306165396234336461626439666435316264 Jan 23 18:31:51.694206 kernel: audit: type=1327 audit(1769193111.663:587): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3662656139346339386532306165396234336461626439666435316264 Jan 23 18:31:51.665000 audit: BPF prog-id=173 op=LOAD Jan 23 18:31:51.706053 kernel: audit: type=1334 audit(1769193111.665:588): prog-id=173 op=LOAD Jan 23 18:31:51.665000 audit[3914]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001fa218 a2=98 a3=0 items=0 ppid=3394 pid=3914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:51.712093 kernel: audit: type=1300 audit(1769193111.665:588): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001fa218 a2=98 a3=0 items=0 ppid=3394 pid=3914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:51.665000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3662656139346339386532306165396234336461626439666435316264 Jan 23 18:31:51.727270 kernel: audit: type=1327 audit(1769193111.665:588): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3662656139346339386532306165396234336461626439666435316264 Jan 23 18:31:51.665000 audit: BPF prog-id=173 op=UNLOAD Jan 23 18:31:51.742715 kernel: audit: type=1334 audit(1769193111.665:589): prog-id=173 op=UNLOAD Jan 23 18:31:51.665000 audit[3914]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3394 pid=3914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:51.750195 kernel: audit: type=1300 
audit(1769193111.665:589): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3394 pid=3914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:51.665000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3662656139346339386532306165396234336461626439666435316264 Jan 23 18:31:51.777256 containerd[1684]: time="2026-01-23T18:31:51.777123030Z" level=info msg="StartContainer for \"6bea94c98e20ae9b43dabd9fd51bd630227ea35c37f46e2298d4ae1be89c9a0e\" returns successfully" Jan 23 18:31:51.665000 audit: BPF prog-id=172 op=UNLOAD Jan 23 18:31:51.782960 kernel: audit: type=1327 audit(1769193111.665:589): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3662656139346339386532306165396234336461626439666435316264 Jan 23 18:31:51.784067 kernel: audit: type=1334 audit(1769193111.665:590): prog-id=172 op=UNLOAD Jan 23 18:31:51.665000 audit[3914]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3394 pid=3914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:51.665000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3662656139346339386532306165396234336461626439666435316264 Jan 23 18:31:51.665000 audit: BPF prog-id=174 op=LOAD Jan 23 18:31:51.665000 audit[3914]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001fa6e8 a2=98 a3=0 items=0 ppid=3394 pid=3914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:51.665000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3662656139346339386532306165396234336461626439666435316264 Jan 23 18:31:51.917309 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 23 18:31:51.917437 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
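A note on reading the audit records above and below: the PROCTITLE field is the process argv, NUL-separated and hex-encoded (and truncated by the kernel for long command lines, as with the runc records here). A small decoder, with an example value taken from the bpftool records further down:

package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// decodeProctitle turns an audit PROCTITLE hex string back into a readable command line.
func decodeProctitle(h string) (string, error) {
	raw, err := hex.DecodeString(h)
	if err != nil {
		return "", err
	}
	// argv elements are NUL-separated; join them with spaces for display.
	return strings.ReplaceAll(string(raw), "\x00", " "), nil
}

func main() {
	s, err := decodeProctitle("627066746F6F6C006D6170006C697374002D2D6A736F6E")
	if err != nil {
		panic(err)
	}
	fmt.Println(s) // prints: bpftool map list --json
}

The runc proctitles in the records above decode the same way, to "runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/6bea94c98e20ae9b43dabd9fd51bd" with the container ID cut off at the kernel's proctitle length limit.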
Jan 23 18:31:52.125693 kubelet[2852]: I0123 18:31:52.125656 2852 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c3e274ab-7780-47cb-bdba-8ff5c1acc9ce-whisker-backend-key-pair\") pod \"c3e274ab-7780-47cb-bdba-8ff5c1acc9ce\" (UID: \"c3e274ab-7780-47cb-bdba-8ff5c1acc9ce\") " Jan 23 18:31:52.125693 kubelet[2852]: I0123 18:31:52.125696 2852 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvxc8\" (UniqueName: \"kubernetes.io/projected/c3e274ab-7780-47cb-bdba-8ff5c1acc9ce-kube-api-access-rvxc8\") pod \"c3e274ab-7780-47cb-bdba-8ff5c1acc9ce\" (UID: \"c3e274ab-7780-47cb-bdba-8ff5c1acc9ce\") " Jan 23 18:31:52.126437 kubelet[2852]: I0123 18:31:52.125713 2852 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3e274ab-7780-47cb-bdba-8ff5c1acc9ce-whisker-ca-bundle\") pod \"c3e274ab-7780-47cb-bdba-8ff5c1acc9ce\" (UID: \"c3e274ab-7780-47cb-bdba-8ff5c1acc9ce\") " Jan 23 18:31:52.126872 kubelet[2852]: I0123 18:31:52.126850 2852 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3e274ab-7780-47cb-bdba-8ff5c1acc9ce-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "c3e274ab-7780-47cb-bdba-8ff5c1acc9ce" (UID: "c3e274ab-7780-47cb-bdba-8ff5c1acc9ce"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 23 18:31:52.134952 kubelet[2852]: I0123 18:31:52.134876 2852 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3e274ab-7780-47cb-bdba-8ff5c1acc9ce-kube-api-access-rvxc8" (OuterVolumeSpecName: "kube-api-access-rvxc8") pod "c3e274ab-7780-47cb-bdba-8ff5c1acc9ce" (UID: "c3e274ab-7780-47cb-bdba-8ff5c1acc9ce"). InnerVolumeSpecName "kube-api-access-rvxc8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 23 18:31:52.135023 kubelet[2852]: I0123 18:31:52.134955 2852 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3e274ab-7780-47cb-bdba-8ff5c1acc9ce-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "c3e274ab-7780-47cb-bdba-8ff5c1acc9ce" (UID: "c3e274ab-7780-47cb-bdba-8ff5c1acc9ce"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 23 18:31:52.227005 kubelet[2852]: I0123 18:31:52.226756 2852 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c3e274ab-7780-47cb-bdba-8ff5c1acc9ce-whisker-backend-key-pair\") on node \"ci-4547-1-0-c-e2d32aff86\" DevicePath \"\"" Jan 23 18:31:52.227005 kubelet[2852]: I0123 18:31:52.226782 2852 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rvxc8\" (UniqueName: \"kubernetes.io/projected/c3e274ab-7780-47cb-bdba-8ff5c1acc9ce-kube-api-access-rvxc8\") on node \"ci-4547-1-0-c-e2d32aff86\" DevicePath \"\"" Jan 23 18:31:52.227005 kubelet[2852]: I0123 18:31:52.226789 2852 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3e274ab-7780-47cb-bdba-8ff5c1acc9ce-whisker-ca-bundle\") on node \"ci-4547-1-0-c-e2d32aff86\" DevicePath \"\"" Jan 23 18:31:52.384931 systemd[1]: Removed slice kubepods-besteffort-podc3e274ab_7780_47cb_bdba_8ff5c1acc9ce.slice - libcontainer container kubepods-besteffort-podc3e274ab_7780_47cb_bdba_8ff5c1acc9ce.slice. Jan 23 18:31:52.396091 kubelet[2852]: I0123 18:31:52.396037 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-bk2m5" podStartSLOduration=1.7220735 podStartE2EDuration="22.396023041s" podCreationTimestamp="2026-01-23 18:31:30 +0000 UTC" firstStartedPulling="2026-01-23 18:31:30.832428893 +0000 UTC m=+21.025462833" lastFinishedPulling="2026-01-23 18:31:51.506378434 +0000 UTC m=+41.699412374" observedRunningTime="2026-01-23 18:31:52.100947269 +0000 UTC m=+42.293981209" watchObservedRunningTime="2026-01-23 18:31:52.396023041 +0000 UTC m=+42.589056981" Jan 23 18:31:52.437992 systemd[1]: Created slice kubepods-besteffort-pode698d847_3c19_47a7_986c_5552a2964f3f.slice - libcontainer container kubepods-besteffort-pode698d847_3c19_47a7_986c_5552a2964f3f.slice. Jan 23 18:31:52.480137 systemd[1]: var-lib-kubelet-pods-c3e274ab\x2d7780\x2d47cb\x2dbdba\x2d8ff5c1acc9ce-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2drvxc8.mount: Deactivated successfully. Jan 23 18:31:52.480498 systemd[1]: var-lib-kubelet-pods-c3e274ab\x2d7780\x2d47cb\x2dbdba\x2d8ff5c1acc9ce-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
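The two mount units deactivated above encode kubelet volume paths using systemd's unit-name escaping: "-" stands in for "/", and literal bytes are written as \xNN. A rough hedged decoder, approximating what systemd-escape --unescape --path does, recovers the paths:

package main

import (
	"fmt"
	"strconv"
	"strings"
)

// unescapeUnitPath reverses systemd's path escaping for .mount unit names.
func unescapeUnitPath(unit string) string {
	name := strings.TrimSuffix(unit, ".mount")
	var b strings.Builder
	for i := 0; i < len(name); i++ {
		c := name[i]
		// "\xNN" encodes a literal byte (e.g. \x2d is "-", \x7e is "~").
		if c == '\\' && i+3 < len(name) && name[i+1] == 'x' {
			if v, err := strconv.ParseUint(name[i+2:i+4], 16, 8); err == nil {
				b.WriteByte(byte(v))
				i += 3
				continue
			}
		}
		// Remaining dashes are path separators.
		if c == '-' {
			b.WriteByte('/')
			continue
		}
		b.WriteByte(c)
	}
	return "/" + b.String()
}

func main() {
	unit := `var-lib-kubelet-pods-c3e274ab\x2d7780\x2d47cb\x2dbdba\x2d8ff5c1acc9ce-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount`
	fmt.Println(unescapeUnitPath(unit))
	// /var/lib/kubelet/pods/c3e274ab-7780-47cb-bdba-8ff5c1acc9ce/volumes/kubernetes.io~secret/whisker-backend-key-pair
}

The kube-api-access unit decodes the same way, to the pod's kubernetes.io~projected/kube-api-access-rvxc8 volume directory, which matches the "Cleaned up orphaned pod volumes dir" message later in the log.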
Jan 23 18:31:52.529341 kubelet[2852]: I0123 18:31:52.529271 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e698d847-3c19-47a7-986c-5552a2964f3f-whisker-backend-key-pair\") pod \"whisker-7b5b4545c8-4tbgk\" (UID: \"e698d847-3c19-47a7-986c-5552a2964f3f\") " pod="calico-system/whisker-7b5b4545c8-4tbgk" Jan 23 18:31:52.529341 kubelet[2852]: I0123 18:31:52.529318 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e698d847-3c19-47a7-986c-5552a2964f3f-whisker-ca-bundle\") pod \"whisker-7b5b4545c8-4tbgk\" (UID: \"e698d847-3c19-47a7-986c-5552a2964f3f\") " pod="calico-system/whisker-7b5b4545c8-4tbgk" Jan 23 18:31:52.529341 kubelet[2852]: I0123 18:31:52.529410 2852 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzrrg\" (UniqueName: \"kubernetes.io/projected/e698d847-3c19-47a7-986c-5552a2964f3f-kube-api-access-lzrrg\") pod \"whisker-7b5b4545c8-4tbgk\" (UID: \"e698d847-3c19-47a7-986c-5552a2964f3f\") " pod="calico-system/whisker-7b5b4545c8-4tbgk" Jan 23 18:31:52.745073 containerd[1684]: time="2026-01-23T18:31:52.744857028Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7b5b4545c8-4tbgk,Uid:e698d847-3c19-47a7-986c-5552a2964f3f,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:52.968463 systemd-networkd[1578]: calidaa8b71324f: Link UP Jan 23 18:31:52.969942 systemd-networkd[1578]: calidaa8b71324f: Gained carrier Jan 23 18:31:52.996801 containerd[1684]: 2026-01-23 18:31:52.796 [INFO][4004] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 23 18:31:52.996801 containerd[1684]: 2026-01-23 18:31:52.859 [INFO][4004] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--c--e2d32aff86-k8s-whisker--7b5b4545c8--4tbgk-eth0 whisker-7b5b4545c8- calico-system e698d847-3c19-47a7-986c-5552a2964f3f 925 0 2026-01-23 18:31:52 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7b5b4545c8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4547-1-0-c-e2d32aff86 whisker-7b5b4545c8-4tbgk eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calidaa8b71324f [] [] }} ContainerID="d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" Namespace="calico-system" Pod="whisker-7b5b4545c8-4tbgk" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-whisker--7b5b4545c8--4tbgk-" Jan 23 18:31:52.996801 containerd[1684]: 2026-01-23 18:31:52.859 [INFO][4004] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" Namespace="calico-system" Pod="whisker-7b5b4545c8-4tbgk" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-whisker--7b5b4545c8--4tbgk-eth0" Jan 23 18:31:52.996801 containerd[1684]: 2026-01-23 18:31:52.900 [INFO][4016] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" HandleID="k8s-pod-network.d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" Workload="ci--4547--1--0--c--e2d32aff86-k8s-whisker--7b5b4545c8--4tbgk-eth0" Jan 23 18:31:52.997198 containerd[1684]: 2026-01-23 18:31:52.900 [INFO][4016] 
ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" HandleID="k8s-pod-network.d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" Workload="ci--4547--1--0--c--e2d32aff86-k8s-whisker--7b5b4545c8--4tbgk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5be0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-1-0-c-e2d32aff86", "pod":"whisker-7b5b4545c8-4tbgk", "timestamp":"2026-01-23 18:31:52.900402957 +0000 UTC"}, Hostname:"ci-4547-1-0-c-e2d32aff86", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 18:31:52.997198 containerd[1684]: 2026-01-23 18:31:52.900 [INFO][4016] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:52.997198 containerd[1684]: 2026-01-23 18:31:52.900 [INFO][4016] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 18:31:52.997198 containerd[1684]: 2026-01-23 18:31:52.900 [INFO][4016] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-c-e2d32aff86' Jan 23 18:31:52.997198 containerd[1684]: 2026-01-23 18:31:52.911 [INFO][4016] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:52.997198 containerd[1684]: 2026-01-23 18:31:52.918 [INFO][4016] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:52.997198 containerd[1684]: 2026-01-23 18:31:52.923 [INFO][4016] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:52.997198 containerd[1684]: 2026-01-23 18:31:52.925 [INFO][4016] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:52.997198 containerd[1684]: 2026-01-23 18:31:52.928 [INFO][4016] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:52.997549 containerd[1684]: 2026-01-23 18:31:52.928 [INFO][4016] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 handle="k8s-pod-network.d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:52.997549 containerd[1684]: 2026-01-23 18:31:52.930 [INFO][4016] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61 Jan 23 18:31:52.997549 containerd[1684]: 2026-01-23 18:31:52.937 [INFO][4016] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:52.997549 containerd[1684]: 2026-01-23 18:31:52.943 [INFO][4016] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.100.65/26] block=192.168.100.64/26 handle="k8s-pod-network.d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:52.997549 containerd[1684]: 2026-01-23 18:31:52.943 [INFO][4016] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.65/26] handle="k8s-pod-network.d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:52.997549 
containerd[1684]: 2026-01-23 18:31:52.944 [INFO][4016] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 18:31:52.997549 containerd[1684]: 2026-01-23 18:31:52.944 [INFO][4016] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.65/26] IPv6=[] ContainerID="d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" HandleID="k8s-pod-network.d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" Workload="ci--4547--1--0--c--e2d32aff86-k8s-whisker--7b5b4545c8--4tbgk-eth0" Jan 23 18:31:52.997789 containerd[1684]: 2026-01-23 18:31:52.951 [INFO][4004] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" Namespace="calico-system" Pod="whisker-7b5b4545c8-4tbgk" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-whisker--7b5b4545c8--4tbgk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-whisker--7b5b4545c8--4tbgk-eth0", GenerateName:"whisker-7b5b4545c8-", Namespace:"calico-system", SelfLink:"", UID:"e698d847-3c19-47a7-986c-5552a2964f3f", ResourceVersion:"925", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7b5b4545c8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"", Pod:"whisker-7b5b4545c8-4tbgk", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.100.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calidaa8b71324f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:52.997789 containerd[1684]: 2026-01-23 18:31:52.952 [INFO][4004] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.65/32] ContainerID="d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" Namespace="calico-system" Pod="whisker-7b5b4545c8-4tbgk" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-whisker--7b5b4545c8--4tbgk-eth0" Jan 23 18:31:52.997921 containerd[1684]: 2026-01-23 18:31:52.952 [INFO][4004] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidaa8b71324f ContainerID="d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" Namespace="calico-system" Pod="whisker-7b5b4545c8-4tbgk" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-whisker--7b5b4545c8--4tbgk-eth0" Jan 23 18:31:52.997921 containerd[1684]: 2026-01-23 18:31:52.969 [INFO][4004] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" Namespace="calico-system" Pod="whisker-7b5b4545c8-4tbgk" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-whisker--7b5b4545c8--4tbgk-eth0" Jan 23 18:31:52.998050 containerd[1684]: 2026-01-23 18:31:52.970 [INFO][4004] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" Namespace="calico-system" Pod="whisker-7b5b4545c8-4tbgk" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-whisker--7b5b4545c8--4tbgk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-whisker--7b5b4545c8--4tbgk-eth0", GenerateName:"whisker-7b5b4545c8-", Namespace:"calico-system", SelfLink:"", UID:"e698d847-3c19-47a7-986c-5552a2964f3f", ResourceVersion:"925", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7b5b4545c8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61", Pod:"whisker-7b5b4545c8-4tbgk", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.100.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calidaa8b71324f", MAC:"7a:5d:fc:b6:75:38", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:52.998141 containerd[1684]: 2026-01-23 18:31:52.992 [INFO][4004] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" Namespace="calico-system" Pod="whisker-7b5b4545c8-4tbgk" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-whisker--7b5b4545c8--4tbgk-eth0" Jan 23 18:31:53.067409 containerd[1684]: time="2026-01-23T18:31:53.067240321Z" level=info msg="connecting to shim d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61" address="unix:///run/containerd/s/09732d668013a23bbd83bdda5ad4bf098addc08d5951a1ac2f7d409e3ce533b0" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:53.121734 systemd[1]: Started cri-containerd-d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61.scope - libcontainer container d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61. 
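For concreteness, the IPAM trace above shows the node holding an affinity for the block 192.168.100.64/26 and handing the whisker pod the address 192.168.100.65 from it. A quick hedged check of that arithmetic with Go's net/netip:

package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// Values taken from the ipam/ipam.go messages above.
	block := netip.MustParsePrefix("192.168.100.64/26")
	pod := netip.MustParseAddr("192.168.100.65")

	// A /26 spans 2^(32-26) = 64 addresses: 192.168.100.64 .. 192.168.100.127.
	fmt.Println("addresses in block:", 1<<(32-block.Bits()))
	fmt.Println("pod address inside block:", block.Contains(pod)) // true
}

The workload endpoint itself is written with a /32 (192.168.100.65/32), since the block affinity only governs which node may allocate addresses from that range.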
Jan 23 18:31:53.152000 audit: BPF prog-id=175 op=LOAD Jan 23 18:31:53.153000 audit: BPF prog-id=176 op=LOAD Jan 23 18:31:53.153000 audit[4050]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186238 a2=98 a3=0 items=0 ppid=4037 pid=4050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.153000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6431633566303835323734616531326636396233386233633561356133 Jan 23 18:31:53.153000 audit: BPF prog-id=176 op=UNLOAD Jan 23 18:31:53.153000 audit[4050]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4037 pid=4050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.153000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6431633566303835323734616531326636396233386233633561356133 Jan 23 18:31:53.154000 audit: BPF prog-id=177 op=LOAD Jan 23 18:31:53.154000 audit[4050]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000186488 a2=98 a3=0 items=0 ppid=4037 pid=4050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.154000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6431633566303835323734616531326636396233386233633561356133 Jan 23 18:31:53.154000 audit: BPF prog-id=178 op=LOAD Jan 23 18:31:53.154000 audit[4050]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000186218 a2=98 a3=0 items=0 ppid=4037 pid=4050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.154000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6431633566303835323734616531326636396233386233633561356133 Jan 23 18:31:53.154000 audit: BPF prog-id=178 op=UNLOAD Jan 23 18:31:53.154000 audit[4050]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4037 pid=4050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.154000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6431633566303835323734616531326636396233386233633561356133 Jan 23 18:31:53.154000 audit: BPF prog-id=177 op=UNLOAD Jan 23 18:31:53.154000 audit[4050]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4037 pid=4050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.154000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6431633566303835323734616531326636396233386233633561356133 Jan 23 18:31:53.154000 audit: BPF prog-id=179 op=LOAD Jan 23 18:31:53.154000 audit[4050]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001866e8 a2=98 a3=0 items=0 ppid=4037 pid=4050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.154000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6431633566303835323734616531326636396233386233633561356133 Jan 23 18:31:53.216952 containerd[1684]: time="2026-01-23T18:31:53.216878472Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7b5b4545c8-4tbgk,Uid:e698d847-3c19-47a7-986c-5552a2964f3f,Namespace:calico-system,Attempt:0,} returns sandbox id \"d1c5f085274ae12f69b38b3c5a5a3b06bd78901f04f017d77d8526059e2f0a61\"" Jan 23 18:31:53.218548 containerd[1684]: time="2026-01-23T18:31:53.218370181Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 18:31:53.628000 audit: BPF prog-id=180 op=LOAD Jan 23 18:31:53.628000 audit[4219]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff278c68e0 a2=98 a3=1fffffffffffffff items=0 ppid=4111 pid=4219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.628000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 18:31:53.628000 audit: BPF prog-id=180 op=UNLOAD Jan 23 18:31:53.628000 audit[4219]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fff278c68b0 a3=0 items=0 ppid=4111 pid=4219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.628000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 18:31:53.628000 audit: BPF prog-id=181 op=LOAD Jan 23 18:31:53.628000 audit[4219]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff278c67c0 a2=94 a3=3 items=0 ppid=4111 pid=4219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.628000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 18:31:53.628000 audit: BPF prog-id=181 op=UNLOAD Jan 23 18:31:53.628000 audit[4219]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff278c67c0 a2=94 a3=3 items=0 ppid=4111 pid=4219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.628000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 18:31:53.628000 audit: BPF prog-id=182 op=LOAD Jan 23 18:31:53.628000 audit[4219]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff278c6800 a2=94 a3=7fff278c69e0 items=0 ppid=4111 pid=4219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.628000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 18:31:53.628000 audit: BPF prog-id=182 op=UNLOAD Jan 23 18:31:53.628000 audit[4219]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff278c6800 a2=94 a3=7fff278c69e0 items=0 ppid=4111 pid=4219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.628000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 18:31:53.629000 audit: BPF prog-id=183 op=LOAD Jan 23 18:31:53.629000 audit[4220]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd3c2358c0 a2=98 a3=3 items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.629000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.629000 audit: BPF prog-id=183 op=UNLOAD Jan 23 18:31:53.629000 audit[4220]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffd3c235890 a3=0 items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.629000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.629000 audit: BPF prog-id=184 op=LOAD Jan 23 18:31:53.629000 audit[4220]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd3c2356b0 a2=94 a3=54428f items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.629000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.629000 audit: BPF prog-id=184 op=UNLOAD Jan 23 18:31:53.629000 audit[4220]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd3c2356b0 a2=94 a3=54428f items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.629000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.629000 audit: BPF prog-id=185 op=LOAD Jan 23 18:31:53.629000 audit[4220]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd3c2356e0 a2=94 a3=2 items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.629000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.629000 audit: BPF prog-id=185 op=UNLOAD Jan 23 18:31:53.629000 audit[4220]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd3c2356e0 a2=0 a3=2 items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.629000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.665081 containerd[1684]: time="2026-01-23T18:31:53.665047351Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:53.666711 containerd[1684]: time="2026-01-23T18:31:53.666688101Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 18:31:53.666786 containerd[1684]: time="2026-01-23T18:31:53.666769571Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:53.667065 kubelet[2852]: E0123 18:31:53.666920 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:31:53.667065 kubelet[2852]: E0123 18:31:53.666961 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:31:53.671202 kubelet[2852]: E0123 18:31:53.671177 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8ad8e04c94d744beb4c48881ae67d8bd,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lzrrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7b5b4545c8-4tbgk_calico-system(e698d847-3c19-47a7-986c-5552a2964f3f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:53.673665 containerd[1684]: time="2026-01-23T18:31:53.673630369Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 18:31:53.761000 audit: BPF prog-id=186 op=LOAD Jan 23 18:31:53.761000 audit[4220]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd3c2355a0 a2=94 a3=1 items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.761000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.761000 audit: BPF prog-id=186 op=UNLOAD Jan 23 18:31:53.761000 audit[4220]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd3c2355a0 a2=94 a3=1 items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.761000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.768000 audit: BPF prog-id=187 op=LOAD Jan 23 18:31:53.768000 audit[4220]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd3c235590 a2=94 a3=4 items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.768000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.768000 audit: BPF prog-id=187 op=UNLOAD Jan 23 18:31:53.768000 audit[4220]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffd3c235590 a2=0 a3=4 
items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.768000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.769000 audit: BPF prog-id=188 op=LOAD Jan 23 18:31:53.769000 audit[4220]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd3c2353f0 a2=94 a3=5 items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.769000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.769000 audit: BPF prog-id=188 op=UNLOAD Jan 23 18:31:53.769000 audit[4220]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffd3c2353f0 a2=0 a3=5 items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.769000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.769000 audit: BPF prog-id=189 op=LOAD Jan 23 18:31:53.769000 audit[4220]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd3c235610 a2=94 a3=6 items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.769000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.769000 audit: BPF prog-id=189 op=UNLOAD Jan 23 18:31:53.769000 audit[4220]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffd3c235610 a2=0 a3=6 items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.769000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.769000 audit: BPF prog-id=190 op=LOAD Jan 23 18:31:53.769000 audit[4220]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd3c234dc0 a2=94 a3=88 items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.769000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.769000 audit: BPF prog-id=191 op=LOAD Jan 23 18:31:53.769000 audit[4220]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffd3c234c40 a2=94 a3=2 items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.769000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.769000 audit: BPF prog-id=191 op=UNLOAD Jan 23 18:31:53.769000 audit[4220]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffd3c234c70 a2=0 a3=7ffd3c234d70 items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.769000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.770000 audit: BPF prog-id=190 op=UNLOAD Jan 23 18:31:53.770000 audit[4220]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=14968d10 a2=0 a3=d1d4e51c9e1d57c2 items=0 ppid=4111 pid=4220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.770000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 18:31:53.779000 audit: BPF prog-id=192 op=LOAD Jan 23 18:31:53.779000 audit[4223]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffdb96f0e0 a2=98 a3=1999999999999999 items=0 ppid=4111 pid=4223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.779000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 18:31:53.779000 audit: BPF prog-id=192 op=UNLOAD Jan 23 18:31:53.779000 audit[4223]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fffdb96f0b0 a3=0 items=0 ppid=4111 pid=4223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.779000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 18:31:53.779000 audit: BPF prog-id=193 op=LOAD Jan 23 18:31:53.779000 audit[4223]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffdb96efc0 a2=94 a3=ffff items=0 ppid=4111 pid=4223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.779000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 18:31:53.779000 audit: BPF prog-id=193 op=UNLOAD Jan 23 18:31:53.779000 audit[4223]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fffdb96efc0 a2=94 a3=ffff items=0 ppid=4111 pid=4223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.779000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 18:31:53.779000 audit: BPF prog-id=194 op=LOAD Jan 23 18:31:53.779000 audit[4223]: SYSCALL arch=c000003e syscall=321 success=yes 
exit=3 a0=5 a1=7fffdb96f000 a2=94 a3=7fffdb96f1e0 items=0 ppid=4111 pid=4223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.779000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 18:31:53.779000 audit: BPF prog-id=194 op=UNLOAD Jan 23 18:31:53.779000 audit[4223]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fffdb96f000 a2=94 a3=7fffdb96f1e0 items=0 ppid=4111 pid=4223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.779000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 18:31:53.836693 systemd-networkd[1578]: vxlan.calico: Link UP Jan 23 18:31:53.836700 systemd-networkd[1578]: vxlan.calico: Gained carrier Jan 23 18:31:53.877000 audit: BPF prog-id=195 op=LOAD Jan 23 18:31:53.877000 audit[4248]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff8723fe10 a2=98 a3=0 items=0 ppid=4111 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.877000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:53.877000 audit: BPF prog-id=195 op=UNLOAD Jan 23 18:31:53.877000 audit[4248]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fff8723fde0 a3=0 items=0 ppid=4111 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.877000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:53.877000 audit: BPF prog-id=196 op=LOAD Jan 23 18:31:53.877000 audit[4248]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff8723fc20 a2=94 a3=54428f items=0 ppid=4111 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.877000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:53.877000 audit: BPF prog-id=196 op=UNLOAD Jan 23 18:31:53.877000 audit[4248]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff8723fc20 a2=94 a3=54428f items=0 ppid=4111 pid=4248 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.877000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:53.877000 audit: BPF prog-id=197 op=LOAD Jan 23 18:31:53.877000 audit[4248]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff8723fc50 a2=94 a3=2 items=0 ppid=4111 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.877000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:53.877000 audit: BPF prog-id=197 op=UNLOAD Jan 23 18:31:53.877000 audit[4248]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff8723fc50 a2=0 a3=2 items=0 ppid=4111 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.877000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:53.877000 audit: BPF prog-id=198 op=LOAD Jan 23 18:31:53.877000 audit[4248]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff8723fa00 a2=94 a3=4 items=0 ppid=4111 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.877000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:53.877000 audit: BPF prog-id=198 op=UNLOAD Jan 23 18:31:53.877000 audit[4248]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff8723fa00 a2=94 a3=4 items=0 ppid=4111 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.877000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:53.877000 audit: BPF prog-id=199 op=LOAD Jan 23 18:31:53.877000 audit[4248]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff8723fb00 a2=94 a3=7fff8723fc80 items=0 ppid=4111 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.877000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:53.878000 audit: BPF prog-id=199 op=UNLOAD Jan 23 18:31:53.878000 audit[4248]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff8723fb00 a2=0 a3=7fff8723fc80 items=0 ppid=4111 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.878000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:53.878000 audit: BPF prog-id=200 op=LOAD Jan 23 18:31:53.878000 audit[4248]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff8723f230 a2=94 a3=2 items=0 ppid=4111 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.878000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:53.878000 audit: BPF prog-id=200 op=UNLOAD Jan 23 18:31:53.878000 audit[4248]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff8723f230 a2=0 a3=2 items=0 ppid=4111 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.878000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:53.878000 audit: BPF prog-id=201 op=LOAD Jan 23 18:31:53.878000 audit[4248]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff8723f330 a2=94 a3=30 items=0 ppid=4111 pid=4248 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.878000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 18:31:53.884000 audit: BPF prog-id=202 op=LOAD Jan 23 18:31:53.884000 audit[4251]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd5bd00180 a2=98 a3=0 items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.884000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:53.884000 audit: BPF prog-id=202 op=UNLOAD Jan 23 18:31:53.884000 audit[4251]: SYSCALL arch=c000003e syscall=3 success=yes 
exit=0 a0=3 a1=8 a2=7ffd5bd00150 a3=0 items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.884000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:53.885000 audit: BPF prog-id=203 op=LOAD Jan 23 18:31:53.885000 audit[4251]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd5bcfff70 a2=94 a3=54428f items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.885000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:53.885000 audit: BPF prog-id=203 op=UNLOAD Jan 23 18:31:53.885000 audit[4251]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd5bcfff70 a2=94 a3=54428f items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.885000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:53.885000 audit: BPF prog-id=204 op=LOAD Jan 23 18:31:53.885000 audit[4251]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd5bcfffa0 a2=94 a3=2 items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.885000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:53.885000 audit: BPF prog-id=204 op=UNLOAD Jan 23 18:31:53.885000 audit[4251]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd5bcfffa0 a2=0 a3=2 items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:53.885000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:53.912410 kubelet[2852]: I0123 18:31:53.912260 2852 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3e274ab-7780-47cb-bdba-8ff5c1acc9ce" path="/var/lib/kubelet/pods/c3e274ab-7780-47cb-bdba-8ff5c1acc9ce/volumes" Jan 23 18:31:54.021000 audit: BPF prog-id=205 op=LOAD Jan 23 18:31:54.021000 audit[4251]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffd5bcffe60 a2=94 a3=1 items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.021000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:54.021000 audit: BPF prog-id=205 op=UNLOAD Jan 23 18:31:54.021000 audit[4251]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffd5bcffe60 a2=94 a3=1 items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.021000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:54.028000 audit: BPF prog-id=206 op=LOAD Jan 23 18:31:54.028000 audit[4251]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd5bcffe50 a2=94 a3=4 items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.028000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:54.028000 audit: BPF prog-id=206 op=UNLOAD Jan 23 18:31:54.028000 audit[4251]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffd5bcffe50 a2=0 a3=4 items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.028000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:54.028000 audit: BPF prog-id=207 op=LOAD Jan 23 18:31:54.028000 audit[4251]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd5bcffcb0 a2=94 a3=5 items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.028000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:54.028000 audit: BPF prog-id=207 op=UNLOAD Jan 23 18:31:54.028000 audit[4251]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7ffd5bcffcb0 a2=0 a3=5 items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.028000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:54.028000 audit: BPF prog-id=208 op=LOAD Jan 23 18:31:54.028000 audit[4251]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd5bcffed0 a2=94 a3=6 items=0 ppid=4111 pid=4251 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.028000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:54.028000 audit: BPF prog-id=208 op=UNLOAD Jan 23 18:31:54.028000 audit[4251]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffd5bcffed0 a2=0 a3=6 items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.028000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:54.029000 audit: BPF prog-id=209 op=LOAD Jan 23 18:31:54.029000 audit[4251]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffd5bcff680 a2=94 a3=88 items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.029000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:54.029000 audit: BPF prog-id=210 op=LOAD Jan 23 18:31:54.029000 audit[4251]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffd5bcff500 a2=94 a3=2 items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.029000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:54.029000 audit: BPF prog-id=210 op=UNLOAD Jan 23 18:31:54.029000 audit[4251]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffd5bcff530 a2=0 a3=7ffd5bcff630 items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.029000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:54.029000 audit: BPF prog-id=209 op=UNLOAD Jan 23 18:31:54.029000 audit[4251]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=1b80ed10 a2=0 a3=b56e692c7e741cc2 items=0 ppid=4111 pid=4251 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.029000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 18:31:54.038000 audit: BPF prog-id=201 op=UNLOAD 
Jan 23 18:31:54.038000 audit[4111]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c0007b81c0 a2=0 a3=0 items=0 ppid=4101 pid=4111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.038000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 23 18:31:54.098000 audit[4280]: NETFILTER_CFG table=nat:119 family=2 entries=15 op=nft_register_chain pid=4280 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:54.098000 audit[4280]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffe33540250 a2=0 a3=7ffe3354023c items=0 ppid=4111 pid=4280 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.098000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:54.100000 audit[4278]: NETFILTER_CFG table=mangle:120 family=2 entries=16 op=nft_register_chain pid=4278 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:54.100000 audit[4278]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7ffca7358760 a2=0 a3=7ffca735874c items=0 ppid=4111 pid=4278 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.100000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:54.107000 audit[4277]: NETFILTER_CFG table=raw:121 family=2 entries=21 op=nft_register_chain pid=4277 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:54.107000 audit[4277]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7ffe551160a0 a2=0 a3=7ffe5511608c items=0 ppid=4111 pid=4277 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.107000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:54.127006 containerd[1684]: time="2026-01-23T18:31:54.126833696Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:54.128081 containerd[1684]: time="2026-01-23T18:31:54.128042743Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 18:31:54.128182 containerd[1684]: time="2026-01-23T18:31:54.128168703Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:54.128529 kubelet[2852]: E0123 18:31:54.128463 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 18:31:54.128529 kubelet[2852]: E0123 18:31:54.128514 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 18:31:54.128798 kubelet[2852]: E0123 18:31:54.128711 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzrrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7b5b4545c8-4tbgk_calico-system(e698d847-3c19-47a7-986c-5552a2964f3f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:54.130117 kubelet[2852]: E0123 18:31:54.129951 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7b5b4545c8-4tbgk" 
podUID="e698d847-3c19-47a7-986c-5552a2964f3f" Jan 23 18:31:54.109000 audit[4282]: NETFILTER_CFG table=filter:122 family=2 entries=94 op=nft_register_chain pid=4282 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:54.109000 audit[4282]: SYSCALL arch=c000003e syscall=46 success=yes exit=53116 a0=3 a1=7ffd7333b670 a2=0 a3=7ffd7333b65c items=0 ppid=4111 pid=4282 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:54.109000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:54.766119 systemd-networkd[1578]: calidaa8b71324f: Gained IPv6LL Jan 23 18:31:54.911595 containerd[1684]: time="2026-01-23T18:31:54.911545990Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-548cd799bc-rm88n,Uid:8bfceda8-a381-44e3-ab5c-8d6cd2f190e0,Namespace:calico-apiserver,Attempt:0,}" Jan 23 18:31:54.911902 containerd[1684]: time="2026-01-23T18:31:54.911867411Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nqrcz,Uid:87ab8ef8-50a9-48bb-8cd1-1b2e13486180,Namespace:kube-system,Attempt:0,}" Jan 23 18:31:55.100053 kubelet[2852]: E0123 18:31:55.099833 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7b5b4545c8-4tbgk" podUID="e698d847-3c19-47a7-986c-5552a2964f3f" Jan 23 18:31:55.118870 systemd-networkd[1578]: cali0b80eb4ba88: Link UP Jan 23 18:31:55.122283 systemd-networkd[1578]: cali0b80eb4ba88: Gained carrier Jan 23 18:31:55.144533 containerd[1684]: 2026-01-23 18:31:54.992 [INFO][4294] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--rm88n-eth0 calico-apiserver-548cd799bc- calico-apiserver 8bfceda8-a381-44e3-ab5c-8d6cd2f190e0 846 0 2026-01-23 18:31:26 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:548cd799bc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-1-0-c-e2d32aff86 calico-apiserver-548cd799bc-rm88n eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali0b80eb4ba88 [] [] }} ContainerID="034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-rm88n" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--rm88n-" Jan 23 18:31:55.144533 containerd[1684]: 2026-01-23 
18:31:54.993 [INFO][4294] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-rm88n" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--rm88n-eth0" Jan 23 18:31:55.144533 containerd[1684]: 2026-01-23 18:31:55.049 [INFO][4316] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" HandleID="k8s-pod-network.034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" Workload="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--rm88n-eth0" Jan 23 18:31:55.144900 containerd[1684]: 2026-01-23 18:31:55.049 [INFO][4316] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" HandleID="k8s-pod-network.034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" Workload="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--rm88n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5800), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-1-0-c-e2d32aff86", "pod":"calico-apiserver-548cd799bc-rm88n", "timestamp":"2026-01-23 18:31:55.049546963 +0000 UTC"}, Hostname:"ci-4547-1-0-c-e2d32aff86", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 18:31:55.144900 containerd[1684]: 2026-01-23 18:31:55.049 [INFO][4316] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:55.144900 containerd[1684]: 2026-01-23 18:31:55.049 [INFO][4316] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 18:31:55.144900 containerd[1684]: 2026-01-23 18:31:55.049 [INFO][4316] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-c-e2d32aff86' Jan 23 18:31:55.144900 containerd[1684]: 2026-01-23 18:31:55.058 [INFO][4316] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.144900 containerd[1684]: 2026-01-23 18:31:55.063 [INFO][4316] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.144900 containerd[1684]: 2026-01-23 18:31:55.069 [INFO][4316] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.144900 containerd[1684]: 2026-01-23 18:31:55.071 [INFO][4316] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.144900 containerd[1684]: 2026-01-23 18:31:55.074 [INFO][4316] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.145094 containerd[1684]: 2026-01-23 18:31:55.074 [INFO][4316] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 handle="k8s-pod-network.034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.145094 containerd[1684]: 2026-01-23 18:31:55.076 [INFO][4316] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907 Jan 23 18:31:55.145094 containerd[1684]: 2026-01-23 18:31:55.081 [INFO][4316] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.145094 containerd[1684]: 2026-01-23 18:31:55.088 [INFO][4316] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.100.66/26] block=192.168.100.64/26 handle="k8s-pod-network.034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.145094 containerd[1684]: 2026-01-23 18:31:55.088 [INFO][4316] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.66/26] handle="k8s-pod-network.034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.145094 containerd[1684]: 2026-01-23 18:31:55.088 [INFO][4316] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 18:31:55.145094 containerd[1684]: 2026-01-23 18:31:55.089 [INFO][4316] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.66/26] IPv6=[] ContainerID="034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" HandleID="k8s-pod-network.034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" Workload="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--rm88n-eth0" Jan 23 18:31:55.145214 containerd[1684]: 2026-01-23 18:31:55.099 [INFO][4294] cni-plugin/k8s.go 418: Populated endpoint ContainerID="034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-rm88n" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--rm88n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--rm88n-eth0", GenerateName:"calico-apiserver-548cd799bc-", Namespace:"calico-apiserver", SelfLink:"", UID:"8bfceda8-a381-44e3-ab5c-8d6cd2f190e0", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"548cd799bc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"", Pod:"calico-apiserver-548cd799bc-rm88n", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.100.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0b80eb4ba88", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:55.145257 containerd[1684]: 2026-01-23 18:31:55.100 [INFO][4294] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.66/32] ContainerID="034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-rm88n" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--rm88n-eth0" Jan 23 18:31:55.145257 containerd[1684]: 2026-01-23 18:31:55.102 [INFO][4294] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0b80eb4ba88 ContainerID="034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-rm88n" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--rm88n-eth0" Jan 23 18:31:55.145257 containerd[1684]: 2026-01-23 18:31:55.124 [INFO][4294] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-rm88n" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--rm88n-eth0" Jan 23 18:31:55.145299 containerd[1684]: 2026-01-23 
18:31:55.127 [INFO][4294] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-rm88n" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--rm88n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--rm88n-eth0", GenerateName:"calico-apiserver-548cd799bc-", Namespace:"calico-apiserver", SelfLink:"", UID:"8bfceda8-a381-44e3-ab5c-8d6cd2f190e0", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"548cd799bc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907", Pod:"calico-apiserver-548cd799bc-rm88n", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.100.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0b80eb4ba88", MAC:"7a:b8:a8:2f:a0:a6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:55.145339 containerd[1684]: 2026-01-23 18:31:55.137 [INFO][4294] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-rm88n" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--rm88n-eth0" Jan 23 18:31:55.151000 audit[4340]: NETFILTER_CFG table=filter:123 family=2 entries=20 op=nft_register_rule pid=4340 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:55.151000 audit[4340]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffcd67f2d90 a2=0 a3=7ffcd67f2d7c items=0 ppid=2960 pid=4340 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.151000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:55.157000 audit[4340]: NETFILTER_CFG table=nat:124 family=2 entries=14 op=nft_register_rule pid=4340 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:55.157000 audit[4340]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffcd67f2d90 a2=0 a3=0 items=0 ppid=2960 pid=4340 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 18:31:55.157000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:55.170000 audit[4342]: NETFILTER_CFG table=filter:125 family=2 entries=50 op=nft_register_chain pid=4342 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:55.170000 audit[4342]: SYSCALL arch=c000003e syscall=46 success=yes exit=28208 a0=3 a1=7ffc31bfb750 a2=0 a3=7ffc31bfb73c items=0 ppid=4111 pid=4342 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.170000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:55.174008 containerd[1684]: time="2026-01-23T18:31:55.173944632Z" level=info msg="connecting to shim 034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907" address="unix:///run/containerd/s/989488b94d94e35424ad6c92cceec1263e67f80d625b82ac49b9bc7357bcc5a4" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:55.199121 systemd[1]: Started cri-containerd-034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907.scope - libcontainer container 034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907. Jan 23 18:31:55.199995 systemd-networkd[1578]: cali031e7520793: Link UP Jan 23 18:31:55.201605 systemd-networkd[1578]: cali031e7520793: Gained carrier Jan 23 18:31:55.214949 containerd[1684]: 2026-01-23 18:31:55.000 [INFO][4293] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--nqrcz-eth0 coredns-668d6bf9bc- kube-system 87ab8ef8-50a9-48bb-8cd1-1b2e13486180 848 0 2026-01-23 18:31:17 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-1-0-c-e2d32aff86 coredns-668d6bf9bc-nqrcz eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali031e7520793 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" Namespace="kube-system" Pod="coredns-668d6bf9bc-nqrcz" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--nqrcz-" Jan 23 18:31:55.214949 containerd[1684]: 2026-01-23 18:31:55.001 [INFO][4293] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" Namespace="kube-system" Pod="coredns-668d6bf9bc-nqrcz" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--nqrcz-eth0" Jan 23 18:31:55.214949 containerd[1684]: 2026-01-23 18:31:55.054 [INFO][4321] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" HandleID="k8s-pod-network.cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" Workload="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--nqrcz-eth0" Jan 23 18:31:55.215099 containerd[1684]: 2026-01-23 18:31:55.055 [INFO][4321] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" 
HandleID="k8s-pod-network.cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" Workload="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--nqrcz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f2f0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-1-0-c-e2d32aff86", "pod":"coredns-668d6bf9bc-nqrcz", "timestamp":"2026-01-23 18:31:55.05465023 +0000 UTC"}, Hostname:"ci-4547-1-0-c-e2d32aff86", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 18:31:55.215099 containerd[1684]: 2026-01-23 18:31:55.055 [INFO][4321] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:55.215099 containerd[1684]: 2026-01-23 18:31:55.089 [INFO][4321] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 18:31:55.215099 containerd[1684]: 2026-01-23 18:31:55.089 [INFO][4321] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-c-e2d32aff86' Jan 23 18:31:55.215099 containerd[1684]: 2026-01-23 18:31:55.162 [INFO][4321] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.215099 containerd[1684]: 2026-01-23 18:31:55.168 [INFO][4321] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.215099 containerd[1684]: 2026-01-23 18:31:55.173 [INFO][4321] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.215099 containerd[1684]: 2026-01-23 18:31:55.175 [INFO][4321] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.215099 containerd[1684]: 2026-01-23 18:31:55.177 [INFO][4321] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.216001 containerd[1684]: 2026-01-23 18:31:55.177 [INFO][4321] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 handle="k8s-pod-network.cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.216001 containerd[1684]: 2026-01-23 18:31:55.179 [INFO][4321] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc Jan 23 18:31:55.216001 containerd[1684]: 2026-01-23 18:31:55.185 [INFO][4321] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.216001 containerd[1684]: 2026-01-23 18:31:55.193 [INFO][4321] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.100.67/26] block=192.168.100.64/26 handle="k8s-pod-network.cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.216001 containerd[1684]: 2026-01-23 18:31:55.193 [INFO][4321] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.67/26] handle="k8s-pod-network.cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:55.216001 containerd[1684]: 2026-01-23 18:31:55.193 [INFO][4321] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 18:31:55.216001 containerd[1684]: 2026-01-23 18:31:55.194 [INFO][4321] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.67/26] IPv6=[] ContainerID="cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" HandleID="k8s-pod-network.cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" Workload="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--nqrcz-eth0" Jan 23 18:31:55.216113 containerd[1684]: 2026-01-23 18:31:55.197 [INFO][4293] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" Namespace="kube-system" Pod="coredns-668d6bf9bc-nqrcz" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--nqrcz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--nqrcz-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"87ab8ef8-50a9-48bb-8cd1-1b2e13486180", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"", Pod:"coredns-668d6bf9bc-nqrcz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali031e7520793", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:55.216113 containerd[1684]: 2026-01-23 18:31:55.197 [INFO][4293] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.67/32] ContainerID="cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" Namespace="kube-system" Pod="coredns-668d6bf9bc-nqrcz" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--nqrcz-eth0" Jan 23 18:31:55.216113 containerd[1684]: 2026-01-23 18:31:55.197 [INFO][4293] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali031e7520793 ContainerID="cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" Namespace="kube-system" Pod="coredns-668d6bf9bc-nqrcz" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--nqrcz-eth0" Jan 23 18:31:55.216113 containerd[1684]: 2026-01-23 18:31:55.200 [INFO][4293] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-nqrcz" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--nqrcz-eth0" Jan 23 18:31:55.216113 containerd[1684]: 2026-01-23 18:31:55.201 [INFO][4293] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" Namespace="kube-system" Pod="coredns-668d6bf9bc-nqrcz" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--nqrcz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--nqrcz-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"87ab8ef8-50a9-48bb-8cd1-1b2e13486180", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc", Pod:"coredns-668d6bf9bc-nqrcz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali031e7520793", MAC:"9e:ba:98:02:65:fd", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:55.216113 containerd[1684]: 2026-01-23 18:31:55.210 [INFO][4293] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" Namespace="kube-system" Pod="coredns-668d6bf9bc-nqrcz" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--nqrcz-eth0" Jan 23 18:31:55.219000 audit: BPF prog-id=211 op=LOAD Jan 23 18:31:55.220000 audit: BPF prog-id=212 op=LOAD Jan 23 18:31:55.220000 audit[4361]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4350 pid=4361 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.220000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033343032386665316230313464323932326533303364343535363631 Jan 23 18:31:55.220000 audit: BPF prog-id=212 op=UNLOAD Jan 23 18:31:55.220000 
audit[4361]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4350 pid=4361 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.220000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033343032386665316230313464323932326533303364343535363631 Jan 23 18:31:55.220000 audit: BPF prog-id=213 op=LOAD Jan 23 18:31:55.220000 audit[4361]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4350 pid=4361 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.220000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033343032386665316230313464323932326533303364343535363631 Jan 23 18:31:55.220000 audit: BPF prog-id=214 op=LOAD Jan 23 18:31:55.220000 audit[4361]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4350 pid=4361 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.220000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033343032386665316230313464323932326533303364343535363631 Jan 23 18:31:55.220000 audit: BPF prog-id=214 op=UNLOAD Jan 23 18:31:55.220000 audit[4361]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4350 pid=4361 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.220000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033343032386665316230313464323932326533303364343535363631 Jan 23 18:31:55.220000 audit: BPF prog-id=213 op=UNLOAD Jan 23 18:31:55.220000 audit[4361]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4350 pid=4361 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.220000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033343032386665316230313464323932326533303364343535363631 Jan 23 18:31:55.220000 audit: BPF prog-id=215 op=LOAD Jan 23 18:31:55.220000 audit[4361]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4350 pid=4361 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.220000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3033343032386665316230313464323932326533303364343535363631 Jan 23 18:31:55.232000 audit[4388]: NETFILTER_CFG table=filter:126 family=2 entries=52 op=nft_register_chain pid=4388 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:55.232000 audit[4388]: SYSCALL arch=c000003e syscall=46 success=yes exit=26592 a0=3 a1=7ffeae3f58d0 a2=0 a3=7ffeae3f58bc items=0 ppid=4111 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.232000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:55.248989 containerd[1684]: time="2026-01-23T18:31:55.248699959Z" level=info msg="connecting to shim cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc" address="unix:///run/containerd/s/7bd27f789aabf4ce813fea9d192d59e57e344212b349b31988721251d0973c59" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:55.262498 containerd[1684]: time="2026-01-23T18:31:55.262464679Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-548cd799bc-rm88n,Uid:8bfceda8-a381-44e3-ab5c-8d6cd2f190e0,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"034028fe1b014d2922e303d45566122ffbd78a0689caf17981a69d7ff6246907\"" Jan 23 18:31:55.264242 containerd[1684]: time="2026-01-23T18:31:55.263750886Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:31:55.279109 systemd[1]: Started cri-containerd-cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc.scope - libcontainer container cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc. 
Jan 23 18:31:55.286000 audit: BPF prog-id=216 op=LOAD Jan 23 18:31:55.286000 audit: BPF prog-id=217 op=LOAD Jan 23 18:31:55.286000 audit[4416]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4399 pid=4416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.286000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365636634613936623634313435663936363934386438313636373132 Jan 23 18:31:55.286000 audit: BPF prog-id=217 op=UNLOAD Jan 23 18:31:55.286000 audit[4416]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4399 pid=4416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.286000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365636634613936623634313435663936363934386438313636373132 Jan 23 18:31:55.286000 audit: BPF prog-id=218 op=LOAD Jan 23 18:31:55.286000 audit[4416]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4399 pid=4416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.286000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365636634613936623634313435663936363934386438313636373132 Jan 23 18:31:55.287000 audit: BPF prog-id=219 op=LOAD Jan 23 18:31:55.287000 audit[4416]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4399 pid=4416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365636634613936623634313435663936363934386438313636373132 Jan 23 18:31:55.287000 audit: BPF prog-id=219 op=UNLOAD Jan 23 18:31:55.287000 audit[4416]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4399 pid=4416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365636634613936623634313435663936363934386438313636373132 Jan 23 18:31:55.287000 audit: BPF prog-id=218 op=UNLOAD Jan 23 18:31:55.287000 audit[4416]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4399 pid=4416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365636634613936623634313435663936363934386438313636373132 Jan 23 18:31:55.287000 audit: BPF prog-id=220 op=LOAD Jan 23 18:31:55.287000 audit[4416]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4399 pid=4416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.287000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365636634613936623634313435663936363934386438313636373132 Jan 23 18:31:55.317861 containerd[1684]: time="2026-01-23T18:31:55.317835318Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-nqrcz,Uid:87ab8ef8-50a9-48bb-8cd1-1b2e13486180,Namespace:kube-system,Attempt:0,} returns sandbox id \"cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc\"" Jan 23 18:31:55.320008 containerd[1684]: time="2026-01-23T18:31:55.319990639Z" level=info msg="CreateContainer within sandbox \"cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 23 18:31:55.330092 containerd[1684]: time="2026-01-23T18:31:55.330062840Z" level=info msg="Container 59d242aae512f6c17393dc8a9759938736333d8e6c8e29a16e1a75131f618f16: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:55.333314 containerd[1684]: time="2026-01-23T18:31:55.333262977Z" level=info msg="CreateContainer within sandbox \"cecf4a96b64145f966948d816671269410630e78012f59b9ef90c727a1a8e3cc\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"59d242aae512f6c17393dc8a9759938736333d8e6c8e29a16e1a75131f618f16\"" Jan 23 18:31:55.333830 containerd[1684]: time="2026-01-23T18:31:55.333775159Z" level=info msg="StartContainer for \"59d242aae512f6c17393dc8a9759938736333d8e6c8e29a16e1a75131f618f16\"" Jan 23 18:31:55.334888 containerd[1684]: time="2026-01-23T18:31:55.334867034Z" level=info msg="connecting to shim 59d242aae512f6c17393dc8a9759938736333d8e6c8e29a16e1a75131f618f16" address="unix:///run/containerd/s/7bd27f789aabf4ce813fea9d192d59e57e344212b349b31988721251d0973c59" protocol=ttrpc version=3 Jan 23 18:31:55.352106 systemd[1]: Started cri-containerd-59d242aae512f6c17393dc8a9759938736333d8e6c8e29a16e1a75131f618f16.scope - libcontainer container 59d242aae512f6c17393dc8a9759938736333d8e6c8e29a16e1a75131f618f16. 
Jan 23 18:31:55.362000 audit: BPF prog-id=221 op=LOAD Jan 23 18:31:55.362000 audit: BPF prog-id=222 op=LOAD Jan 23 18:31:55.362000 audit[4442]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4399 pid=4442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.362000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539643234326161653531326636633137333933646338613937353939 Jan 23 18:31:55.363000 audit: BPF prog-id=222 op=UNLOAD Jan 23 18:31:55.363000 audit[4442]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4399 pid=4442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.363000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539643234326161653531326636633137333933646338613937353939 Jan 23 18:31:55.363000 audit: BPF prog-id=223 op=LOAD Jan 23 18:31:55.363000 audit[4442]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4399 pid=4442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.363000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539643234326161653531326636633137333933646338613937353939 Jan 23 18:31:55.363000 audit: BPF prog-id=224 op=LOAD Jan 23 18:31:55.363000 audit[4442]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4399 pid=4442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.363000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539643234326161653531326636633137333933646338613937353939 Jan 23 18:31:55.363000 audit: BPF prog-id=224 op=UNLOAD Jan 23 18:31:55.363000 audit[4442]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4399 pid=4442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.363000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539643234326161653531326636633137333933646338613937353939 Jan 23 18:31:55.363000 audit: BPF prog-id=223 op=UNLOAD Jan 23 18:31:55.363000 audit[4442]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4399 pid=4442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.363000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539643234326161653531326636633137333933646338613937353939 Jan 23 18:31:55.363000 audit: BPF prog-id=225 op=LOAD Jan 23 18:31:55.363000 audit[4442]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4399 pid=4442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:55.363000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539643234326161653531326636633137333933646338613937353939 Jan 23 18:31:55.377144 containerd[1684]: time="2026-01-23T18:31:55.377119388Z" level=info msg="StartContainer for \"59d242aae512f6c17393dc8a9759938736333d8e6c8e29a16e1a75131f618f16\" returns successfully" Jan 23 18:31:55.404109 systemd-networkd[1578]: vxlan.calico: Gained IPv6LL Jan 23 18:31:55.795528 containerd[1684]: time="2026-01-23T18:31:55.795293089Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:55.797130 containerd[1684]: time="2026-01-23T18:31:55.797051698Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:31:55.797221 containerd[1684]: time="2026-01-23T18:31:55.797168119Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:55.797532 kubelet[2852]: E0123 18:31:55.797445 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:31:55.797532 kubelet[2852]: E0123 18:31:55.797508 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:31:55.797730 kubelet[2852]: E0123 18:31:55.797670 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2whq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-548cd799bc-rm88n_calico-apiserver(8bfceda8-a381-44e3-ab5c-8d6cd2f190e0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:55.799108 kubelet[2852]: E0123 18:31:55.799031 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" podUID="8bfceda8-a381-44e3-ab5c-8d6cd2f190e0" Jan 23 18:31:55.912729 containerd[1684]: time="2026-01-23T18:31:55.912058669Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-dw7bl,Uid:8cfde420-1102-4e10-b36c-f5766b852cc7,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:55.912969 containerd[1684]: time="2026-01-23T18:31:55.912892583Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-rtp4c,Uid:43b5afc6-15bc-4463-a623-3577c228380b,Namespace:kube-system,Attempt:0,}" Jan 23 18:31:56.098803 systemd-networkd[1578]: calie155e633396: Link UP Jan 23 18:31:56.100458 systemd-networkd[1578]: calie155e633396: Gained carrier Jan 23 18:31:56.115026 kubelet[2852]: E0123 18:31:56.113697 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" podUID="8bfceda8-a381-44e3-ab5c-8d6cd2f190e0" Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:55.997 [INFO][4480] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--c--e2d32aff86-k8s-goldmane--666569f655--dw7bl-eth0 goldmane-666569f655- calico-system 8cfde420-1102-4e10-b36c-f5766b852cc7 847 0 2026-01-23 18:31:28 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4547-1-0-c-e2d32aff86 goldmane-666569f655-dw7bl eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calie155e633396 [] [] }} ContainerID="80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" Namespace="calico-system" Pod="goldmane-666569f655-dw7bl" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-goldmane--666569f655--dw7bl-" Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:55.997 [INFO][4480] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" Namespace="calico-system" Pod="goldmane-666569f655-dw7bl" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-goldmane--666569f655--dw7bl-eth0" Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.039 [INFO][4501] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" HandleID="k8s-pod-network.80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" Workload="ci--4547--1--0--c--e2d32aff86-k8s-goldmane--666569f655--dw7bl-eth0" Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.039 [INFO][4501] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" HandleID="k8s-pod-network.80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" Workload="ci--4547--1--0--c--e2d32aff86-k8s-goldmane--666569f655--dw7bl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d58f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-1-0-c-e2d32aff86", "pod":"goldmane-666569f655-dw7bl", "timestamp":"2026-01-23 18:31:56.039383541 +0000 UTC"}, Hostname:"ci-4547-1-0-c-e2d32aff86", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.039 [INFO][4501] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.039 [INFO][4501] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.039 [INFO][4501] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-c-e2d32aff86' Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.050 [INFO][4501] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.057 [INFO][4501] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.063 [INFO][4501] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.066 [INFO][4501] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.069 [INFO][4501] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.069 [INFO][4501] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 handle="k8s-pod-network.80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.071 [INFO][4501] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.076 [INFO][4501] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.085 [INFO][4501] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.100.68/26] block=192.168.100.64/26 handle="k8s-pod-network.80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.086 [INFO][4501] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.68/26] handle="k8s-pod-network.80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.086 [INFO][4501] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 18:31:56.144436 containerd[1684]: 2026-01-23 18:31:56.086 [INFO][4501] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.68/26] IPv6=[] ContainerID="80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" HandleID="k8s-pod-network.80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" Workload="ci--4547--1--0--c--e2d32aff86-k8s-goldmane--666569f655--dw7bl-eth0" Jan 23 18:31:56.150327 containerd[1684]: 2026-01-23 18:31:56.090 [INFO][4480] cni-plugin/k8s.go 418: Populated endpoint ContainerID="80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" Namespace="calico-system" Pod="goldmane-666569f655-dw7bl" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-goldmane--666569f655--dw7bl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-goldmane--666569f655--dw7bl-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"8cfde420-1102-4e10-b36c-f5766b852cc7", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"", Pod:"goldmane-666569f655-dw7bl", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.100.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie155e633396", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:56.150327 containerd[1684]: 2026-01-23 18:31:56.090 [INFO][4480] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.68/32] ContainerID="80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" Namespace="calico-system" Pod="goldmane-666569f655-dw7bl" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-goldmane--666569f655--dw7bl-eth0" Jan 23 18:31:56.150327 containerd[1684]: 2026-01-23 18:31:56.091 [INFO][4480] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie155e633396 ContainerID="80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" Namespace="calico-system" Pod="goldmane-666569f655-dw7bl" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-goldmane--666569f655--dw7bl-eth0" Jan 23 18:31:56.150327 containerd[1684]: 2026-01-23 18:31:56.102 [INFO][4480] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" Namespace="calico-system" Pod="goldmane-666569f655-dw7bl" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-goldmane--666569f655--dw7bl-eth0" Jan 23 18:31:56.150327 containerd[1684]: 2026-01-23 18:31:56.103 [INFO][4480] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" 
Namespace="calico-system" Pod="goldmane-666569f655-dw7bl" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-goldmane--666569f655--dw7bl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-goldmane--666569f655--dw7bl-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"8cfde420-1102-4e10-b36c-f5766b852cc7", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c", Pod:"goldmane-666569f655-dw7bl", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.100.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calie155e633396", MAC:"a2:b1:65:a1:bf:16", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:56.150327 containerd[1684]: 2026-01-23 18:31:56.129 [INFO][4480] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" Namespace="calico-system" Pod="goldmane-666569f655-dw7bl" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-goldmane--666569f655--dw7bl-eth0" Jan 23 18:31:56.162898 kubelet[2852]: I0123 18:31:56.162858 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-nqrcz" podStartSLOduration=39.162845598 podStartE2EDuration="39.162845598s" podCreationTimestamp="2026-01-23 18:31:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 18:31:56.140099961 +0000 UTC m=+46.333133891" watchObservedRunningTime="2026-01-23 18:31:56.162845598 +0000 UTC m=+46.355879538" Jan 23 18:31:56.165000 audit[4521]: NETFILTER_CFG table=filter:127 family=2 entries=20 op=nft_register_rule pid=4521 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:56.165000 audit[4521]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffdbc2f2e10 a2=0 a3=7ffdbc2f2dfc items=0 ppid=2960 pid=4521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.165000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:56.170000 audit[4521]: NETFILTER_CFG table=nat:128 family=2 entries=14 op=nft_register_rule pid=4521 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:56.170000 audit[4521]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffdbc2f2e10 
a2=0 a3=0 items=0 ppid=2960 pid=4521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.170000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:56.179000 audit[4525]: NETFILTER_CFG table=filter:129 family=2 entries=48 op=nft_register_chain pid=4525 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:56.179000 audit[4525]: SYSCALL arch=c000003e syscall=46 success=yes exit=26352 a0=3 a1=7ffc8b0faf30 a2=0 a3=7ffc8b0faf1c items=0 ppid=4111 pid=4525 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.179000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:56.197143 containerd[1684]: time="2026-01-23T18:31:56.196921630Z" level=info msg="connecting to shim 80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c" address="unix:///run/containerd/s/d86689887c788baf18a0a83efc314e4c3874ab990960f9af006a7dea27086ab4" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:56.203000 audit[4536]: NETFILTER_CFG table=filter:130 family=2 entries=20 op=nft_register_rule pid=4536 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:56.203000 audit[4536]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffecbf61380 a2=0 a3=7ffecbf6136c items=0 ppid=2960 pid=4536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.203000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:56.206000 audit[4536]: NETFILTER_CFG table=nat:131 family=2 entries=14 op=nft_register_rule pid=4536 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:56.206000 audit[4536]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffecbf61380 a2=0 a3=0 items=0 ppid=2960 pid=4536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.206000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:56.227252 systemd-networkd[1578]: cali6316008327a: Link UP Jan 23 18:31:56.228079 systemd-networkd[1578]: cali6316008327a: Gained carrier Jan 23 18:31:56.236398 systemd-networkd[1578]: cali0b80eb4ba88: Gained IPv6LL Jan 23 18:31:56.243121 systemd[1]: Started cri-containerd-80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c.scope - libcontainer container 80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c. 
Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.002 [INFO][4479] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--rtp4c-eth0 coredns-668d6bf9bc- kube-system 43b5afc6-15bc-4463-a623-3577c228380b 839 0 2026-01-23 18:31:17 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-1-0-c-e2d32aff86 coredns-668d6bf9bc-rtp4c eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali6316008327a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" Namespace="kube-system" Pod="coredns-668d6bf9bc-rtp4c" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--rtp4c-" Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.003 [INFO][4479] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" Namespace="kube-system" Pod="coredns-668d6bf9bc-rtp4c" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--rtp4c-eth0" Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.064 [INFO][4503] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" HandleID="k8s-pod-network.6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" Workload="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--rtp4c-eth0" Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.065 [INFO][4503] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" HandleID="k8s-pod-network.6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" Workload="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--rtp4c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000103680), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-1-0-c-e2d32aff86", "pod":"coredns-668d6bf9bc-rtp4c", "timestamp":"2026-01-23 18:31:56.064747091 +0000 UTC"}, Hostname:"ci-4547-1-0-c-e2d32aff86", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.065 [INFO][4503] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.086 [INFO][4503] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.086 [INFO][4503] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-c-e2d32aff86' Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.153 [INFO][4503] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.168 [INFO][4503] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.174 [INFO][4503] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.188 [INFO][4503] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.195 [INFO][4503] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.196 [INFO][4503] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 handle="k8s-pod-network.6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.203 [INFO][4503] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908 Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.214 [INFO][4503] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.220 [INFO][4503] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.100.69/26] block=192.168.100.64/26 handle="k8s-pod-network.6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.220 [INFO][4503] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.69/26] handle="k8s-pod-network.6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.220 [INFO][4503] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 18:31:56.251034 containerd[1684]: 2026-01-23 18:31:56.220 [INFO][4503] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.69/26] IPv6=[] ContainerID="6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" HandleID="k8s-pod-network.6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" Workload="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--rtp4c-eth0" Jan 23 18:31:56.251467 containerd[1684]: 2026-01-23 18:31:56.222 [INFO][4479] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" Namespace="kube-system" Pod="coredns-668d6bf9bc-rtp4c" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--rtp4c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--rtp4c-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"43b5afc6-15bc-4463-a623-3577c228380b", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"", Pod:"coredns-668d6bf9bc-rtp4c", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6316008327a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:56.251467 containerd[1684]: 2026-01-23 18:31:56.223 [INFO][4479] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.69/32] ContainerID="6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" Namespace="kube-system" Pod="coredns-668d6bf9bc-rtp4c" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--rtp4c-eth0" Jan 23 18:31:56.251467 containerd[1684]: 2026-01-23 18:31:56.223 [INFO][4479] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6316008327a ContainerID="6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" Namespace="kube-system" Pod="coredns-668d6bf9bc-rtp4c" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--rtp4c-eth0" Jan 23 18:31:56.251467 containerd[1684]: 2026-01-23 18:31:56.228 [INFO][4479] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-rtp4c" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--rtp4c-eth0" Jan 23 18:31:56.251467 containerd[1684]: 2026-01-23 18:31:56.229 [INFO][4479] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" Namespace="kube-system" Pod="coredns-668d6bf9bc-rtp4c" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--rtp4c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--rtp4c-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"43b5afc6-15bc-4463-a623-3577c228380b", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908", Pod:"coredns-668d6bf9bc-rtp4c", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6316008327a", MAC:"1e:da:d6:83:03:e7", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:56.251467 containerd[1684]: 2026-01-23 18:31:56.241 [INFO][4479] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" Namespace="kube-system" Pod="coredns-668d6bf9bc-rtp4c" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-coredns--668d6bf9bc--rtp4c-eth0" Jan 23 18:31:56.267000 audit: BPF prog-id=226 op=LOAD Jan 23 18:31:56.268000 audit: BPF prog-id=227 op=LOAD Jan 23 18:31:56.268000 audit[4547]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=4535 pid=4547 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.268000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830633466386532356566363561353462646565336163333363623438 Jan 23 18:31:56.268000 audit: BPF prog-id=227 op=UNLOAD Jan 23 18:31:56.268000 
audit[4547]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4535 pid=4547 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.268000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830633466386532356566363561353462646565336163333363623438 Jan 23 18:31:56.268000 audit: BPF prog-id=228 op=LOAD Jan 23 18:31:56.268000 audit[4547]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=4535 pid=4547 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.268000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830633466386532356566363561353462646565336163333363623438 Jan 23 18:31:56.268000 audit: BPF prog-id=229 op=LOAD Jan 23 18:31:56.268000 audit[4547]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=4535 pid=4547 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.268000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830633466386532356566363561353462646565336163333363623438 Jan 23 18:31:56.268000 audit: BPF prog-id=229 op=UNLOAD Jan 23 18:31:56.268000 audit[4547]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4535 pid=4547 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.268000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830633466386532356566363561353462646565336163333363623438 Jan 23 18:31:56.268000 audit: BPF prog-id=228 op=UNLOAD Jan 23 18:31:56.268000 audit[4547]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4535 pid=4547 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.268000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830633466386532356566363561353462646565336163333363623438 Jan 23 18:31:56.268000 audit: BPF prog-id=230 op=LOAD Jan 23 18:31:56.268000 audit[4547]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=4535 pid=4547 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.268000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830633466386532356566363561353462646565336163333363623438 Jan 23 18:31:56.285689 containerd[1684]: time="2026-01-23T18:31:56.285519032Z" level=info msg="connecting to shim 6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908" address="unix:///run/containerd/s/1d10b38d409c0f4648867c239fb9ad41c5673d695af18a7f82ca63860109cc87" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:56.284000 audit[4576]: NETFILTER_CFG table=filter:132 family=2 entries=40 op=nft_register_chain pid=4576 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:56.284000 audit[4576]: SYSCALL arch=c000003e syscall=46 success=yes exit=20328 a0=3 a1=7ffcbe60d2d0 a2=0 a3=7ffcbe60d2bc items=0 ppid=4111 pid=4576 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.284000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:56.309238 systemd[1]: Started cri-containerd-6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908.scope - libcontainer container 6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908. Jan 23 18:31:56.311905 containerd[1684]: time="2026-01-23T18:31:56.311830347Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-dw7bl,Uid:8cfde420-1102-4e10-b36c-f5766b852cc7,Namespace:calico-system,Attempt:0,} returns sandbox id \"80c4f8e25ef65a54bdee3ac33cb48838543f582b4248d6ea1e21dd7ea3d6de9c\"" Jan 23 18:31:56.315297 containerd[1684]: time="2026-01-23T18:31:56.314550740Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 18:31:56.319000 audit: BPF prog-id=231 op=LOAD Jan 23 18:31:56.319000 audit: BPF prog-id=232 op=LOAD Jan 23 18:31:56.319000 audit[4595]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00022c238 a2=98 a3=0 items=0 ppid=4584 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.319000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665393437306537366331663766326263336131396133303732626431 Jan 23 18:31:56.319000 audit: BPF prog-id=232 op=UNLOAD Jan 23 18:31:56.319000 audit[4595]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4584 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.319000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665393437306537366331663766326263336131396133303732626431 Jan 23 18:31:56.320000 audit: BPF prog-id=233 op=LOAD Jan 23 18:31:56.320000 audit[4595]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00022c488 a2=98 a3=0 items=0 ppid=4584 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.320000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665393437306537366331663766326263336131396133303732626431 Jan 23 18:31:56.320000 audit: BPF prog-id=234 op=LOAD Jan 23 18:31:56.320000 audit[4595]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00022c218 a2=98 a3=0 items=0 ppid=4584 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.320000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665393437306537366331663766326263336131396133303732626431 Jan 23 18:31:56.320000 audit: BPF prog-id=234 op=UNLOAD Jan 23 18:31:56.320000 audit[4595]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4584 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.320000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665393437306537366331663766326263336131396133303732626431 Jan 23 18:31:56.320000 audit: BPF prog-id=233 op=UNLOAD Jan 23 18:31:56.320000 audit[4595]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4584 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.320000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665393437306537366331663766326263336131396133303732626431 Jan 23 18:31:56.320000 audit: BPF prog-id=235 op=LOAD Jan 23 18:31:56.320000 audit[4595]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00022c6e8 a2=98 a3=0 items=0 ppid=4584 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.320000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665393437306537366331663766326263336131396133303732626431 Jan 23 18:31:56.349611 containerd[1684]: time="2026-01-23T18:31:56.349474717Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-rtp4c,Uid:43b5afc6-15bc-4463-a623-3577c228380b,Namespace:kube-system,Attempt:0,} returns sandbox id \"6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908\"" Jan 23 18:31:56.353204 containerd[1684]: time="2026-01-23T18:31:56.352752932Z" level=info msg="CreateContainer within sandbox \"6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 23 18:31:56.359435 containerd[1684]: time="2026-01-23T18:31:56.359418624Z" level=info msg="Container 3473b20189f2c792084566f42e23a8fae44864c8bb4e6dcb303fea48850d71b4: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:31:56.363305 containerd[1684]: time="2026-01-23T18:31:56.363271053Z" level=info msg="CreateContainer within sandbox \"6e9470e76c1f7f2bc3a19a3072bd14fbe1748491e3de2b5308491cb3338cd908\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3473b20189f2c792084566f42e23a8fae44864c8bb4e6dcb303fea48850d71b4\"" Jan 23 18:31:56.364040 containerd[1684]: time="2026-01-23T18:31:56.363509534Z" level=info msg="StartContainer for \"3473b20189f2c792084566f42e23a8fae44864c8bb4e6dcb303fea48850d71b4\"" Jan 23 18:31:56.364281 containerd[1684]: time="2026-01-23T18:31:56.364266377Z" level=info msg="connecting to shim 3473b20189f2c792084566f42e23a8fae44864c8bb4e6dcb303fea48850d71b4" address="unix:///run/containerd/s/1d10b38d409c0f4648867c239fb9ad41c5673d695af18a7f82ca63860109cc87" protocol=ttrpc version=3 Jan 23 18:31:56.380108 systemd[1]: Started cri-containerd-3473b20189f2c792084566f42e23a8fae44864c8bb4e6dcb303fea48850d71b4.scope - libcontainer container 3473b20189f2c792084566f42e23a8fae44864c8bb4e6dcb303fea48850d71b4. 
Jan 23 18:31:56.389000 audit: BPF prog-id=236 op=LOAD Jan 23 18:31:56.389000 audit: BPF prog-id=237 op=LOAD Jan 23 18:31:56.389000 audit[4629]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4584 pid=4629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.389000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334373362323031383966326337393230383435363666343265323361 Jan 23 18:31:56.389000 audit: BPF prog-id=237 op=UNLOAD Jan 23 18:31:56.389000 audit[4629]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4584 pid=4629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.389000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334373362323031383966326337393230383435363666343265323361 Jan 23 18:31:56.389000 audit: BPF prog-id=238 op=LOAD Jan 23 18:31:56.389000 audit[4629]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4584 pid=4629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.389000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334373362323031383966326337393230383435363666343265323361 Jan 23 18:31:56.389000 audit: BPF prog-id=239 op=LOAD Jan 23 18:31:56.389000 audit[4629]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4584 pid=4629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.389000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334373362323031383966326337393230383435363666343265323361 Jan 23 18:31:56.389000 audit: BPF prog-id=239 op=UNLOAD Jan 23 18:31:56.389000 audit[4629]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4584 pid=4629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.389000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334373362323031383966326337393230383435363666343265323361 Jan 23 18:31:56.389000 audit: BPF prog-id=238 op=UNLOAD Jan 23 18:31:56.389000 audit[4629]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4584 pid=4629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.389000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334373362323031383966326337393230383435363666343265323361 Jan 23 18:31:56.390000 audit: BPF prog-id=240 op=LOAD Jan 23 18:31:56.390000 audit[4629]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4584 pid=4629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:56.390000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3334373362323031383966326337393230383435363666343265323361 Jan 23 18:31:56.405339 containerd[1684]: time="2026-01-23T18:31:56.405270932Z" level=info msg="StartContainer for \"3473b20189f2c792084566f42e23a8fae44864c8bb4e6dcb303fea48850d71b4\" returns successfully" Jan 23 18:31:56.724527 containerd[1684]: time="2026-01-23T18:31:56.724342701Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:56.726891 containerd[1684]: time="2026-01-23T18:31:56.726681532Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 18:31:56.727117 kubelet[2852]: E0123 18:31:56.727036 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:31:56.727117 kubelet[2852]: E0123 18:31:56.727102 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:31:56.727415 containerd[1684]: time="2026-01-23T18:31:56.726789972Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:56.727508 kubelet[2852]: E0123 18:31:56.727266 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x97s9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-dw7bl_calico-system(8cfde420-1102-4e10-b36c-f5766b852cc7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:56.728837 kubelet[2852]: E0123 18:31:56.728783 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-dw7bl" podUID="8cfde420-1102-4e10-b36c-f5766b852cc7" Jan 23 18:31:56.876298 systemd-networkd[1578]: cali031e7520793: Gained IPv6LL Jan 23 18:31:56.913517 containerd[1684]: 
time="2026-01-23T18:31:56.913428840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6z82s,Uid:f61eaf28-593a-461f-8945-a34eecb93534,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:57.127354 kubelet[2852]: E0123 18:31:57.127276 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" podUID="8bfceda8-a381-44e3-ab5c-8d6cd2f190e0" Jan 23 18:31:57.128410 kubelet[2852]: E0123 18:31:57.127707 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-dw7bl" podUID="8cfde420-1102-4e10-b36c-f5766b852cc7" Jan 23 18:31:57.147889 kubelet[2852]: I0123 18:31:57.147677 2852 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-rtp4c" podStartSLOduration=40.147660705 podStartE2EDuration="40.147660705s" podCreationTimestamp="2026-01-23 18:31:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 18:31:57.145234374 +0000 UTC m=+47.338268324" watchObservedRunningTime="2026-01-23 18:31:57.147660705 +0000 UTC m=+47.340694665" Jan 23 18:31:57.172739 systemd-networkd[1578]: cali6e7fcd0054f: Link UP Jan 23 18:31:57.175041 systemd-networkd[1578]: cali6e7fcd0054f: Gained carrier Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.023 [INFO][4662] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--c--e2d32aff86-k8s-csi--node--driver--6z82s-eth0 csi-node-driver- calico-system f61eaf28-593a-461f-8945-a34eecb93534 730 0 2026-01-23 18:31:30 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4547-1-0-c-e2d32aff86 csi-node-driver-6z82s eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali6e7fcd0054f [] [] }} ContainerID="fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" Namespace="calico-system" Pod="csi-node-driver-6z82s" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-csi--node--driver--6z82s-" Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.024 [INFO][4662] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" Namespace="calico-system" Pod="csi-node-driver-6z82s" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-csi--node--driver--6z82s-eth0" Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.079 [INFO][4674] ipam/ipam_plugin.go 227: Calico CNI IPAM request 
count IPv4=1 IPv6=0 ContainerID="fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" HandleID="k8s-pod-network.fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" Workload="ci--4547--1--0--c--e2d32aff86-k8s-csi--node--driver--6z82s-eth0" Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.079 [INFO][4674] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" HandleID="k8s-pod-network.fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" Workload="ci--4547--1--0--c--e2d32aff86-k8s-csi--node--driver--6z82s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cb7e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-1-0-c-e2d32aff86", "pod":"csi-node-driver-6z82s", "timestamp":"2026-01-23 18:31:57.079097788 +0000 UTC"}, Hostname:"ci-4547-1-0-c-e2d32aff86", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.079 [INFO][4674] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.079 [INFO][4674] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.079 [INFO][4674] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-c-e2d32aff86' Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.096 [INFO][4674] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.103 [INFO][4674] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.109 [INFO][4674] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.114 [INFO][4674] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.120 [INFO][4674] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.121 [INFO][4674] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 handle="k8s-pod-network.fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.125 [INFO][4674] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.136 [INFO][4674] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.154 [INFO][4674] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.100.70/26] block=192.168.100.64/26 handle="k8s-pod-network.fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" 
host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.154 [INFO][4674] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.70/26] handle="k8s-pod-network.fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.154 [INFO][4674] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 18:31:57.201024 containerd[1684]: 2026-01-23 18:31:57.154 [INFO][4674] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.70/26] IPv6=[] ContainerID="fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" HandleID="k8s-pod-network.fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" Workload="ci--4547--1--0--c--e2d32aff86-k8s-csi--node--driver--6z82s-eth0" Jan 23 18:31:57.201637 containerd[1684]: 2026-01-23 18:31:57.163 [INFO][4662] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" Namespace="calico-system" Pod="csi-node-driver-6z82s" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-csi--node--driver--6z82s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-csi--node--driver--6z82s-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f61eaf28-593a-461f-8945-a34eecb93534", ResourceVersion:"730", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"", Pod:"csi-node-driver-6z82s", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.100.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6e7fcd0054f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:57.201637 containerd[1684]: 2026-01-23 18:31:57.163 [INFO][4662] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.70/32] ContainerID="fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" Namespace="calico-system" Pod="csi-node-driver-6z82s" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-csi--node--driver--6z82s-eth0" Jan 23 18:31:57.201637 containerd[1684]: 2026-01-23 18:31:57.163 [INFO][4662] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6e7fcd0054f ContainerID="fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" Namespace="calico-system" Pod="csi-node-driver-6z82s" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-csi--node--driver--6z82s-eth0" Jan 23 18:31:57.201637 containerd[1684]: 2026-01-23 18:31:57.169 [INFO][4662] cni-plugin/dataplane_linux.go 
508: Disabling IPv4 forwarding ContainerID="fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" Namespace="calico-system" Pod="csi-node-driver-6z82s" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-csi--node--driver--6z82s-eth0" Jan 23 18:31:57.201637 containerd[1684]: 2026-01-23 18:31:57.170 [INFO][4662] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" Namespace="calico-system" Pod="csi-node-driver-6z82s" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-csi--node--driver--6z82s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-csi--node--driver--6z82s-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f61eaf28-593a-461f-8945-a34eecb93534", ResourceVersion:"730", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab", Pod:"csi-node-driver-6z82s", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.100.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6e7fcd0054f", MAC:"32:94:18:0b:6b:20", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:57.201637 containerd[1684]: 2026-01-23 18:31:57.195 [INFO][4662] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" Namespace="calico-system" Pod="csi-node-driver-6z82s" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-csi--node--driver--6z82s-eth0" Jan 23 18:31:57.239495 kernel: kauditd_printk_skb: 387 callbacks suppressed Jan 23 18:31:57.239608 kernel: audit: type=1325 audit(1769193117.231:724): table=filter:133 family=2 entries=17 op=nft_register_rule pid=4688 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:57.231000 audit[4688]: NETFILTER_CFG table=filter:133 family=2 entries=17 op=nft_register_rule pid=4688 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:57.231000 audit[4688]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffcac7f59c0 a2=0 a3=7ffcac7f59ac items=0 ppid=2960 pid=4688 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:57.247006 kernel: audit: type=1300 audit(1769193117.231:724): arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffcac7f59c0 a2=0 
a3=7ffcac7f59ac items=0 ppid=2960 pid=4688 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:57.231000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:57.248470 containerd[1684]: time="2026-01-23T18:31:57.248422807Z" level=info msg="connecting to shim fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab" address="unix:///run/containerd/s/3d5e2e61225f9a4a7149cbd29a1988819d84c5cbf55056594c6c5c9776691847" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:57.252341 kernel: audit: type=1327 audit(1769193117.231:724): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:57.252000 audit[4691]: NETFILTER_CFG table=filter:134 family=2 entries=54 op=nft_register_chain pid=4691 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:57.252000 audit[4691]: SYSCALL arch=c000003e syscall=46 success=yes exit=25976 a0=3 a1=7ffcfff0c7a0 a2=0 a3=7ffcfff0c78c items=0 ppid=4111 pid=4691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:57.258247 kernel: audit: type=1325 audit(1769193117.252:725): table=filter:134 family=2 entries=54 op=nft_register_chain pid=4691 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:57.258280 kernel: audit: type=1300 audit(1769193117.252:725): arch=c000003e syscall=46 success=yes exit=25976 a0=3 a1=7ffcfff0c7a0 a2=0 a3=7ffcfff0c78c items=0 ppid=4111 pid=4691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:57.252000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:57.266721 kernel: audit: type=1327 audit(1769193117.252:725): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:57.271731 kernel: audit: type=1325 audit(1769193117.255:726): table=nat:135 family=2 entries=35 op=nft_register_chain pid=4688 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:57.255000 audit[4688]: NETFILTER_CFG table=nat:135 family=2 entries=35 op=nft_register_chain pid=4688 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:57.278992 kernel: audit: type=1300 audit(1769193117.255:726): arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7ffcac7f59c0 a2=0 a3=0 items=0 ppid=2960 pid=4688 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:57.255000 audit[4688]: SYSCALL arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7ffcac7f59c0 a2=0 a3=0 items=0 ppid=2960 pid=4688 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:57.255000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:57.283585 systemd[1]: Started cri-containerd-fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab.scope - libcontainer container fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab. Jan 23 18:31:57.283997 kernel: audit: type=1327 audit(1769193117.255:726): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:57.294000 audit: BPF prog-id=241 op=LOAD Jan 23 18:31:57.294000 audit: BPF prog-id=242 op=LOAD Jan 23 18:31:57.297027 kernel: audit: type=1334 audit(1769193117.294:727): prog-id=241 op=LOAD Jan 23 18:31:57.294000 audit[4709]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4699 pid=4709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:57.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661343231653465313631666333336537646335393763316231326461 Jan 23 18:31:57.294000 audit: BPF prog-id=242 op=UNLOAD Jan 23 18:31:57.294000 audit[4709]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4699 pid=4709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:57.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661343231653465313631666333336537646335393763316231326461 Jan 23 18:31:57.295000 audit: BPF prog-id=243 op=LOAD Jan 23 18:31:57.295000 audit[4709]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4699 pid=4709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:57.295000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661343231653465313631666333336537646335393763316231326461 Jan 23 18:31:57.295000 audit: BPF prog-id=244 op=LOAD Jan 23 18:31:57.295000 audit[4709]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4699 pid=4709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:57.295000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661343231653465313631666333336537646335393763316231326461 Jan 23 18:31:57.295000 audit: BPF prog-id=244 op=UNLOAD Jan 23 
18:31:57.295000 audit[4709]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4699 pid=4709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:57.295000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661343231653465313631666333336537646335393763316231326461 Jan 23 18:31:57.295000 audit: BPF prog-id=243 op=UNLOAD Jan 23 18:31:57.295000 audit[4709]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4699 pid=4709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:57.295000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661343231653465313631666333336537646335393763316231326461 Jan 23 18:31:57.295000 audit: BPF prog-id=245 op=LOAD Jan 23 18:31:57.295000 audit[4709]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4699 pid=4709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:57.295000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661343231653465313631666333336537646335393763316231326461 Jan 23 18:31:57.319578 containerd[1684]: time="2026-01-23T18:31:57.319527146Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-6z82s,Uid:f61eaf28-593a-461f-8945-a34eecb93534,Namespace:calico-system,Attempt:0,} returns sandbox id \"fa421e4e161fc33e7dc597c1b12da29c08f391f3f446385f87d084d51d592dab\"" Jan 23 18:31:57.321747 containerd[1684]: time="2026-01-23T18:31:57.321734206Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 18:31:57.747512 containerd[1684]: time="2026-01-23T18:31:57.747441507Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:57.748920 containerd[1684]: time="2026-01-23T18:31:57.748862224Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 18:31:57.749194 containerd[1684]: time="2026-01-23T18:31:57.749087775Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:57.749257 kubelet[2852]: E0123 18:31:57.749215 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:31:57.749315 kubelet[2852]: E0123 18:31:57.749273 2852 kuberuntime_image.go:55] "Failed to 
pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:31:57.749516 kubelet[2852]: E0123 18:31:57.749436 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sc26w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-6z82s_calico-system(f61eaf28-593a-461f-8945-a34eecb93534): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:57.753759 containerd[1684]: time="2026-01-23T18:31:57.753716976Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 18:31:57.901434 systemd-networkd[1578]: cali6316008327a: Gained IPv6LL Jan 23 18:31:58.092369 systemd-networkd[1578]: calie155e633396: Gained IPv6LL Jan 23 18:31:58.130655 kubelet[2852]: E0123 18:31:58.130607 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-dw7bl" podUID="8cfde420-1102-4e10-b36c-f5766b852cc7" Jan 23 18:31:58.198355 containerd[1684]: time="2026-01-23T18:31:58.198298130Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:58.199899 
containerd[1684]: time="2026-01-23T18:31:58.199842347Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 18:31:58.200212 containerd[1684]: time="2026-01-23T18:31:58.199952318Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:58.200361 kubelet[2852]: E0123 18:31:58.200320 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 18:31:58.200520 kubelet[2852]: E0123 18:31:58.200469 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 18:31:58.200704 kubelet[2852]: E0123 18:31:58.200629 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sc26w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-6z82s_calico-system(f61eaf28-593a-461f-8945-a34eecb93534): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:58.202125 kubelet[2852]: E0123 18:31:58.202049 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:31:58.284707 systemd-networkd[1578]: cali6e7fcd0054f: Gained IPv6LL Jan 23 18:31:58.295000 audit[4737]: NETFILTER_CFG table=filter:136 family=2 entries=14 op=nft_register_rule pid=4737 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:58.295000 audit[4737]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffe37dbdf90 a2=0 a3=7ffe37dbdf7c items=0 ppid=2960 pid=4737 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:58.295000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:58.308000 audit[4737]: NETFILTER_CFG table=nat:137 family=2 entries=56 op=nft_register_chain pid=4737 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:31:58.308000 audit[4737]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7ffe37dbdf90 a2=0 a3=7ffe37dbdf7c items=0 ppid=2960 pid=4737 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:58.308000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:31:58.913664 containerd[1684]: time="2026-01-23T18:31:58.913380008Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-548cd799bc-4tqh9,Uid:7830dd1a-252c-46ff-ba84-ba7feca691a8,Namespace:calico-apiserver,Attempt:0,}" Jan 23 18:31:58.915071 containerd[1684]: time="2026-01-23T18:31:58.914824373Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69d9c854f8-4vpsg,Uid:4e218aa3-82e2-43da-97a5-0f48de07a97f,Namespace:calico-system,Attempt:0,}" Jan 23 18:31:59.111762 systemd-networkd[1578]: cali9c98483f6de: Link UP Jan 23 18:31:59.113495 systemd-networkd[1578]: cali9c98483f6de: Gained carrier Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.024 [INFO][4739] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--4tqh9-eth0 calico-apiserver-548cd799bc- calico-apiserver 7830dd1a-252c-46ff-ba84-ba7feca691a8 845 0 2026-01-23 18:31:26 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:548cd799bc 
projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-1-0-c-e2d32aff86 calico-apiserver-548cd799bc-4tqh9 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali9c98483f6de [] [] }} ContainerID="ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-4tqh9" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--4tqh9-" Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.024 [INFO][4739] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-4tqh9" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--4tqh9-eth0" Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.067 [INFO][4764] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" HandleID="k8s-pod-network.ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" Workload="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--4tqh9-eth0" Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.067 [INFO][4764] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" HandleID="k8s-pod-network.ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" Workload="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--4tqh9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d55e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-1-0-c-e2d32aff86", "pod":"calico-apiserver-548cd799bc-4tqh9", "timestamp":"2026-01-23 18:31:59.067221953 +0000 UTC"}, Hostname:"ci-4547-1-0-c-e2d32aff86", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.067 [INFO][4764] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.067 [INFO][4764] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.067 [INFO][4764] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-c-e2d32aff86' Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.075 [INFO][4764] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.079 [INFO][4764] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.089 [INFO][4764] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.090 [INFO][4764] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.091 [INFO][4764] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.091 [INFO][4764] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 handle="k8s-pod-network.ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.093 [INFO][4764] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33 Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.097 [INFO][4764] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.102 [INFO][4764] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.100.71/26] block=192.168.100.64/26 handle="k8s-pod-network.ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.102 [INFO][4764] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.71/26] handle="k8s-pod-network.ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.102 [INFO][4764] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 18:31:59.133832 containerd[1684]: 2026-01-23 18:31:59.102 [INFO][4764] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.71/26] IPv6=[] ContainerID="ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" HandleID="k8s-pod-network.ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" Workload="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--4tqh9-eth0" Jan 23 18:31:59.134623 containerd[1684]: 2026-01-23 18:31:59.106 [INFO][4739] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-4tqh9" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--4tqh9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--4tqh9-eth0", GenerateName:"calico-apiserver-548cd799bc-", Namespace:"calico-apiserver", SelfLink:"", UID:"7830dd1a-252c-46ff-ba84-ba7feca691a8", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"548cd799bc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"", Pod:"calico-apiserver-548cd799bc-4tqh9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.100.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9c98483f6de", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:59.134623 containerd[1684]: 2026-01-23 18:31:59.106 [INFO][4739] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.71/32] ContainerID="ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-4tqh9" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--4tqh9-eth0" Jan 23 18:31:59.134623 containerd[1684]: 2026-01-23 18:31:59.106 [INFO][4739] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9c98483f6de ContainerID="ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-4tqh9" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--4tqh9-eth0" Jan 23 18:31:59.134623 containerd[1684]: 2026-01-23 18:31:59.113 [INFO][4739] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-4tqh9" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--4tqh9-eth0" Jan 23 18:31:59.134623 containerd[1684]: 2026-01-23 
18:31:59.115 [INFO][4739] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-4tqh9" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--4tqh9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--4tqh9-eth0", GenerateName:"calico-apiserver-548cd799bc-", Namespace:"calico-apiserver", SelfLink:"", UID:"7830dd1a-252c-46ff-ba84-ba7feca691a8", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"548cd799bc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33", Pod:"calico-apiserver-548cd799bc-4tqh9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.100.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9c98483f6de", MAC:"72:cc:34:7e:e9:76", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:59.134623 containerd[1684]: 2026-01-23 18:31:59.128 [INFO][4739] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" Namespace="calico-apiserver" Pod="calico-apiserver-548cd799bc-4tqh9" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--apiserver--548cd799bc--4tqh9-eth0" Jan 23 18:31:59.134785 kubelet[2852]: E0123 18:31:59.134366 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:31:59.167386 containerd[1684]: time="2026-01-23T18:31:59.167272032Z" level=info msg="connecting to shim ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33" 
address="unix:///run/containerd/s/c9ac1590f57c20e88f09d96bc169a52357a0d504211dbce2db85fce7172587a6" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:59.183000 audit[4812]: NETFILTER_CFG table=filter:138 family=2 entries=49 op=nft_register_chain pid=4812 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:59.183000 audit[4812]: SYSCALL arch=c000003e syscall=46 success=yes exit=25420 a0=3 a1=7ffc9ac24670 a2=0 a3=7ffc9ac2465c items=0 ppid=4111 pid=4812 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.183000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:59.195193 systemd[1]: Started cri-containerd-ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33.scope - libcontainer container ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33. Jan 23 18:31:59.213000 audit: BPF prog-id=246 op=LOAD Jan 23 18:31:59.214000 audit: BPF prog-id=247 op=LOAD Jan 23 18:31:59.214000 audit[4814]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a238 a2=98 a3=0 items=0 ppid=4802 pid=4814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.214000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6361303465613634316463313362396462653335386263333039666130 Jan 23 18:31:59.214000 audit: BPF prog-id=247 op=UNLOAD Jan 23 18:31:59.214000 audit[4814]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4802 pid=4814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.214000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6361303465613634316463313362396462653335386263333039666130 Jan 23 18:31:59.214000 audit: BPF prog-id=248 op=LOAD Jan 23 18:31:59.214000 audit[4814]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a488 a2=98 a3=0 items=0 ppid=4802 pid=4814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.214000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6361303465613634316463313362396462653335386263333039666130 Jan 23 18:31:59.214000 audit: BPF prog-id=249 op=LOAD Jan 23 18:31:59.214000 audit[4814]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00017a218 a2=98 a3=0 items=0 ppid=4802 pid=4814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.214000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6361303465613634316463313362396462653335386263333039666130 Jan 23 18:31:59.214000 audit: BPF prog-id=249 op=UNLOAD Jan 23 18:31:59.214000 audit[4814]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4802 pid=4814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.214000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6361303465613634316463313362396462653335386263333039666130 Jan 23 18:31:59.214000 audit: BPF prog-id=248 op=UNLOAD Jan 23 18:31:59.214000 audit[4814]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4802 pid=4814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.214000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6361303465613634316463313362396462653335386263333039666130 Jan 23 18:31:59.214000 audit: BPF prog-id=250 op=LOAD Jan 23 18:31:59.214000 audit[4814]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00017a6e8 a2=98 a3=0 items=0 ppid=4802 pid=4814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.214000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6361303465613634316463313362396462653335386263333039666130 Jan 23 18:31:59.216675 systemd-networkd[1578]: cali0e144d1ad2c: Link UP Jan 23 18:31:59.217342 systemd-networkd[1578]: cali0e144d1ad2c: Gained carrier Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.049 [INFO][4746] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--c--e2d32aff86-k8s-calico--kube--controllers--69d9c854f8--4vpsg-eth0 calico-kube-controllers-69d9c854f8- calico-system 4e218aa3-82e2-43da-97a5-0f48de07a97f 844 0 2026-01-23 18:31:30 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:69d9c854f8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4547-1-0-c-e2d32aff86 calico-kube-controllers-69d9c854f8-4vpsg eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali0e144d1ad2c [] [] }} ContainerID="0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" Namespace="calico-system" Pod="calico-kube-controllers-69d9c854f8-4vpsg" 
WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--kube--controllers--69d9c854f8--4vpsg-" Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.049 [INFO][4746] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" Namespace="calico-system" Pod="calico-kube-controllers-69d9c854f8-4vpsg" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--kube--controllers--69d9c854f8--4vpsg-eth0" Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.081 [INFO][4770] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" HandleID="k8s-pod-network.0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" Workload="ci--4547--1--0--c--e2d32aff86-k8s-calico--kube--controllers--69d9c854f8--4vpsg-eth0" Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.083 [INFO][4770] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" HandleID="k8s-pod-network.0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" Workload="ci--4547--1--0--c--e2d32aff86-k8s-calico--kube--controllers--69d9c854f8--4vpsg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5630), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-1-0-c-e2d32aff86", "pod":"calico-kube-controllers-69d9c854f8-4vpsg", "timestamp":"2026-01-23 18:31:59.081689831 +0000 UTC"}, Hostname:"ci-4547-1-0-c-e2d32aff86", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.083 [INFO][4770] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.102 [INFO][4770] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.102 [INFO][4770] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-c-e2d32aff86' Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.180 [INFO][4770] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.188 [INFO][4770] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.194 [INFO][4770] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.196 [INFO][4770] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.197 [INFO][4770] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.197 [INFO][4770] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 handle="k8s-pod-network.0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.199 [INFO][4770] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0 Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.202 [INFO][4770] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.209 [INFO][4770] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.100.72/26] block=192.168.100.64/26 handle="k8s-pod-network.0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.209 [INFO][4770] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.72/26] handle="k8s-pod-network.0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" host="ci-4547-1-0-c-e2d32aff86" Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.209 [INFO][4770] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
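The IPAM entries above trace Calico's block-affinity allocation: the plugin takes a host-wide IPAM lock, loads the block 192.168.100.64/26 that is affine to host ci-4547-1-0-c-e2d32aff86, and claims the next free address (192.168.100.72; the apiserver pod earlier received .71). The following is a minimal illustrative sketch of that idea only, not Calico's actual allocator; the types and function names are invented for the example, and only the CIDR and addresses come from the log.

// Illustrative sketch of block-affinity IPAM as traced in the log above.
// Not Calico's implementation; names and types are assumptions.
package main

import (
	"fmt"
	"net"
	"sync"
)

type affineBlock struct {
	cidr *net.IPNet          // host-affine block, e.g. 192.168.100.64/26
	used map[string]struct{} // addresses already handed out from the block
}

type ipam struct {
	mu    sync.Mutex // stands in for the "host-wide IPAM lock" in the log
	block affineBlock
}

// autoAssign claims the next free /32 from the host's affine block,
// mirroring "Attempting to assign 1 addresses from block ...".
func (p *ipam) autoAssign() (net.IP, error) {
	p.mu.Lock()         // "Acquired host-wide IPAM lock."
	defer p.mu.Unlock() // "Released host-wide IPAM lock."

	for ip := p.block.cidr.IP.Mask(p.block.cidr.Mask); p.block.cidr.Contains(ip); ip = nextIP(ip) {
		if _, taken := p.block.used[ip.String()]; !taken {
			p.block.used[ip.String()] = struct{}{}
			return ip, nil
		}
	}
	return nil, fmt.Errorf("block %s exhausted", p.block.cidr)
}

// nextIP returns the address one greater than ip.
func nextIP(ip net.IP) net.IP {
	next := make(net.IP, len(ip))
	copy(next, ip)
	for i := len(next) - 1; i >= 0; i-- {
		next[i]++
		if next[i] != 0 {
			break
		}
	}
	return next
}

func main() {
	_, cidr, _ := net.ParseCIDR("192.168.100.64/26")
	p := &ipam{block: affineBlock{cidr: cidr, used: map[string]struct{}{}}}
	// Assume .64-.71 were claimed earlier (as in the log, .71 went to the
	// calico-apiserver pod); the next claim then yields 192.168.100.72.
	for i := 64; i <= 71; i++ {
		p.block.used[fmt.Sprintf("192.168.100.%d", i)] = struct{}{}
	}
	ip, _ := p.autoAssign()
	fmt.Println("claimed", ip) // claimed 192.168.100.72
}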
Jan 23 18:31:59.236771 containerd[1684]: 2026-01-23 18:31:59.209 [INFO][4770] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.72/26] IPv6=[] ContainerID="0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" HandleID="k8s-pod-network.0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" Workload="ci--4547--1--0--c--e2d32aff86-k8s-calico--kube--controllers--69d9c854f8--4vpsg-eth0" Jan 23 18:31:59.237234 containerd[1684]: 2026-01-23 18:31:59.212 [INFO][4746] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" Namespace="calico-system" Pod="calico-kube-controllers-69d9c854f8-4vpsg" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--kube--controllers--69d9c854f8--4vpsg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-calico--kube--controllers--69d9c854f8--4vpsg-eth0", GenerateName:"calico-kube-controllers-69d9c854f8-", Namespace:"calico-system", SelfLink:"", UID:"4e218aa3-82e2-43da-97a5-0f48de07a97f", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"69d9c854f8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"", Pod:"calico-kube-controllers-69d9c854f8-4vpsg", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.100.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0e144d1ad2c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:59.237234 containerd[1684]: 2026-01-23 18:31:59.212 [INFO][4746] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.72/32] ContainerID="0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" Namespace="calico-system" Pod="calico-kube-controllers-69d9c854f8-4vpsg" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--kube--controllers--69d9c854f8--4vpsg-eth0" Jan 23 18:31:59.237234 containerd[1684]: 2026-01-23 18:31:59.212 [INFO][4746] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0e144d1ad2c ContainerID="0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" Namespace="calico-system" Pod="calico-kube-controllers-69d9c854f8-4vpsg" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--kube--controllers--69d9c854f8--4vpsg-eth0" Jan 23 18:31:59.237234 containerd[1684]: 2026-01-23 18:31:59.219 [INFO][4746] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" Namespace="calico-system" Pod="calico-kube-controllers-69d9c854f8-4vpsg" 
WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--kube--controllers--69d9c854f8--4vpsg-eth0" Jan 23 18:31:59.237234 containerd[1684]: 2026-01-23 18:31:59.219 [INFO][4746] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" Namespace="calico-system" Pod="calico-kube-controllers-69d9c854f8-4vpsg" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--kube--controllers--69d9c854f8--4vpsg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--c--e2d32aff86-k8s-calico--kube--controllers--69d9c854f8--4vpsg-eth0", GenerateName:"calico-kube-controllers-69d9c854f8-", Namespace:"calico-system", SelfLink:"", UID:"4e218aa3-82e2-43da-97a5-0f48de07a97f", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 18, 31, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"69d9c854f8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-c-e2d32aff86", ContainerID:"0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0", Pod:"calico-kube-controllers-69d9c854f8-4vpsg", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.100.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0e144d1ad2c", MAC:"52:c3:cf:1e:18:69", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 18:31:59.237234 containerd[1684]: 2026-01-23 18:31:59.230 [INFO][4746] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" Namespace="calico-system" Pod="calico-kube-controllers-69d9c854f8-4vpsg" WorkloadEndpoint="ci--4547--1--0--c--e2d32aff86-k8s-calico--kube--controllers--69d9c854f8--4vpsg-eth0" Jan 23 18:31:59.253000 audit[4842]: NETFILTER_CFG table=filter:139 family=2 entries=40 op=nft_register_chain pid=4842 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 18:31:59.253000 audit[4842]: SYSCALL arch=c000003e syscall=46 success=yes exit=20784 a0=3 a1=7ffe57bd8020 a2=0 a3=7ffe57bd800c items=0 ppid=4111 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.253000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 18:31:59.270379 containerd[1684]: time="2026-01-23T18:31:59.270322224Z" level=info msg="connecting to shim 0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0" 
address="unix:///run/containerd/s/fa5568466c8ecc0bb2564a813fb7c4485e573de7b8a58008d587cb3b1cdeec6d" namespace=k8s.io protocol=ttrpc version=3 Jan 23 18:31:59.270572 containerd[1684]: time="2026-01-23T18:31:59.270339954Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-548cd799bc-4tqh9,Uid:7830dd1a-252c-46ff-ba84-ba7feca691a8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"ca04ea641dc13b9dbe358bc309fa0d944f1e8e6cf27368dffb084a8cc835ca33\"" Jan 23 18:31:59.272074 containerd[1684]: time="2026-01-23T18:31:59.272054911Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:31:59.297120 systemd[1]: Started cri-containerd-0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0.scope - libcontainer container 0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0. Jan 23 18:31:59.307000 audit: BPF prog-id=251 op=LOAD Jan 23 18:31:59.308000 audit: BPF prog-id=252 op=LOAD Jan 23 18:31:59.308000 audit[4868]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4858 pid=4868 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064323339303865343464366438623631656339366238646464303232 Jan 23 18:31:59.308000 audit: BPF prog-id=252 op=UNLOAD Jan 23 18:31:59.308000 audit[4868]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4858 pid=4868 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064323339303865343464366438623631656339366238646464303232 Jan 23 18:31:59.308000 audit: BPF prog-id=253 op=LOAD Jan 23 18:31:59.308000 audit[4868]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4858 pid=4868 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064323339303865343464366438623631656339366238646464303232 Jan 23 18:31:59.308000 audit: BPF prog-id=254 op=LOAD Jan 23 18:31:59.308000 audit[4868]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4858 pid=4868 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.308000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064323339303865343464366438623631656339366238646464303232 Jan 23 18:31:59.308000 audit: BPF prog-id=254 op=UNLOAD Jan 23 18:31:59.308000 audit[4868]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4858 pid=4868 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064323339303865343464366438623631656339366238646464303232 Jan 23 18:31:59.308000 audit: BPF prog-id=253 op=UNLOAD Jan 23 18:31:59.308000 audit[4868]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4858 pid=4868 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064323339303865343464366438623631656339366238646464303232 Jan 23 18:31:59.308000 audit: BPF prog-id=255 op=LOAD Jan 23 18:31:59.308000 audit[4868]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4858 pid=4868 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:31:59.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064323339303865343464366438623631656339366238646464303232 Jan 23 18:31:59.338371 containerd[1684]: time="2026-01-23T18:31:59.338344066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69d9c854f8-4vpsg,Uid:4e218aa3-82e2-43da-97a5-0f48de07a97f,Namespace:calico-system,Attempt:0,} returns sandbox id \"0d23908e44d6d8b61ec96b8ddd02280a83848901b9d8c6571a84a4491234dfe0\"" Jan 23 18:31:59.708313 containerd[1684]: time="2026-01-23T18:31:59.708031253Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:31:59.710114 containerd[1684]: time="2026-01-23T18:31:59.709916100Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:31:59.710509 containerd[1684]: time="2026-01-23T18:31:59.710380473Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:31:59.711137 kubelet[2852]: E0123 18:31:59.710903 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:31:59.711137 kubelet[2852]: E0123 18:31:59.711031 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:31:59.711654 containerd[1684]: time="2026-01-23T18:31:59.711585847Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 18:31:59.712656 kubelet[2852]: E0123 18:31:59.712576 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p4nkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-548cd799bc-4tqh9_calico-apiserver(7830dd1a-252c-46ff-ba84-ba7feca691a8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:31:59.714057 kubelet[2852]: E0123 18:31:59.713927 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" podUID="7830dd1a-252c-46ff-ba84-ba7feca691a8" Jan 23 18:32:00.144902 containerd[1684]: time="2026-01-23T18:32:00.144743866Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:00.150455 kubelet[2852]: E0123 18:32:00.150187 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" podUID="7830dd1a-252c-46ff-ba84-ba7feca691a8" Jan 23 18:32:00.154581 containerd[1684]: time="2026-01-23T18:32:00.154483523Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 18:32:00.154803 containerd[1684]: time="2026-01-23T18:32:00.154711264Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:00.155387 kubelet[2852]: E0123 18:32:00.155161 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:32:00.155387 kubelet[2852]: E0123 18:32:00.155284 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:32:00.159251 kubelet[2852]: E0123 18:32:00.159178 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwzkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-69d9c854f8-4vpsg_calico-system(4e218aa3-82e2-43da-97a5-0f48de07a97f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:00.160743 kubelet[2852]: E0123 18:32:00.160566 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" podUID="4e218aa3-82e2-43da-97a5-0f48de07a97f" Jan 23 18:32:00.224000 audit[4901]: NETFILTER_CFG table=filter:140 family=2 entries=14 op=nft_register_rule pid=4901 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:32:00.224000 audit[4901]: SYSCALL 
arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffc11bd1cf0 a2=0 a3=7ffc11bd1cdc items=0 ppid=2960 pid=4901 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:32:00.224000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:32:00.229000 audit[4901]: NETFILTER_CFG table=nat:141 family=2 entries=20 op=nft_register_rule pid=4901 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:32:00.229000 audit[4901]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffc11bd1cf0 a2=0 a3=7ffc11bd1cdc items=0 ppid=2960 pid=4901 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:32:00.229000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:32:00.396471 systemd-networkd[1578]: cali9c98483f6de: Gained IPv6LL Jan 23 18:32:01.147410 kubelet[2852]: E0123 18:32:01.147355 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" podUID="4e218aa3-82e2-43da-97a5-0f48de07a97f" Jan 23 18:32:01.151172 kubelet[2852]: E0123 18:32:01.151096 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" podUID="7830dd1a-252c-46ff-ba84-ba7feca691a8" Jan 23 18:32:01.164460 systemd-networkd[1578]: cali0e144d1ad2c: Gained IPv6LL Jan 23 18:32:05.915930 containerd[1684]: time="2026-01-23T18:32:05.914606430Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 18:32:06.364678 containerd[1684]: time="2026-01-23T18:32:06.364610524Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:06.366884 containerd[1684]: time="2026-01-23T18:32:06.366501020Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 18:32:06.366884 containerd[1684]: time="2026-01-23T18:32:06.366683340Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:06.367320 kubelet[2852]: E0123 18:32:06.367257 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:32:06.367320 kubelet[2852]: E0123 18:32:06.367322 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:32:06.368730 kubelet[2852]: E0123 18:32:06.368562 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8ad8e04c94d744beb4c48881ae67d8bd,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lzrrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7b5b4545c8-4tbgk_calico-system(e698d847-3c19-47a7-986c-5552a2964f3f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:06.372909 containerd[1684]: time="2026-01-23T18:32:06.372727166Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 18:32:06.808642 containerd[1684]: time="2026-01-23T18:32:06.808066510Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:06.809627 containerd[1684]: time="2026-01-23T18:32:06.809546953Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 18:32:06.809882 containerd[1684]: time="2026-01-23T18:32:06.809752163Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:06.810291 kubelet[2852]: E0123 18:32:06.810177 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve 
image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 18:32:06.810291 kubelet[2852]: E0123 18:32:06.810243 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 18:32:06.810431 kubelet[2852]: E0123 18:32:06.810375 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzrrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7b5b4545c8-4tbgk_calico-system(e698d847-3c19-47a7-986c-5552a2964f3f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:06.811886 kubelet[2852]: E0123 18:32:06.811670 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7b5b4545c8-4tbgk" podUID="e698d847-3c19-47a7-986c-5552a2964f3f" Jan 23 18:32:09.912390 
containerd[1684]: time="2026-01-23T18:32:09.911830020Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:32:10.405156 containerd[1684]: time="2026-01-23T18:32:10.405042374Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:10.406755 containerd[1684]: time="2026-01-23T18:32:10.406698868Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:32:10.406874 containerd[1684]: time="2026-01-23T18:32:10.406806278Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:10.407130 kubelet[2852]: E0123 18:32:10.407028 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:10.408160 kubelet[2852]: E0123 18:32:10.407132 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:10.408160 kubelet[2852]: E0123 18:32:10.407454 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2whq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-548cd799bc-rm88n_calico-apiserver(8bfceda8-a381-44e3-ab5c-8d6cd2f190e0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:10.408897 containerd[1684]: time="2026-01-23T18:32:10.407670629Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 18:32:10.409074 kubelet[2852]: E0123 18:32:10.408790 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" podUID="8bfceda8-a381-44e3-ab5c-8d6cd2f190e0" Jan 23 18:32:10.855795 containerd[1684]: time="2026-01-23T18:32:10.855735805Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:10.857325 containerd[1684]: time="2026-01-23T18:32:10.857191267Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 18:32:10.857325 containerd[1684]: time="2026-01-23T18:32:10.857282218Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:10.857660 kubelet[2852]: E0123 18:32:10.857560 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:32:10.857742 kubelet[2852]: E0123 18:32:10.857652 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:32:10.858009 kubelet[2852]: E0123 18:32:10.857887 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x97s9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-dw7bl_calico-system(8cfde420-1102-4e10-b36c-f5766b852cc7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:10.859354 kubelet[2852]: E0123 18:32:10.859290 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-dw7bl" podUID="8cfde420-1102-4e10-b36c-f5766b852cc7" Jan 23 18:32:11.914828 containerd[1684]: time="2026-01-23T18:32:11.913199966Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 
23 18:32:12.353027 containerd[1684]: time="2026-01-23T18:32:12.352785428Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:12.354697 containerd[1684]: time="2026-01-23T18:32:12.354631362Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:32:12.354872 containerd[1684]: time="2026-01-23T18:32:12.354738692Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:12.355095 kubelet[2852]: E0123 18:32:12.355021 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:12.355654 kubelet[2852]: E0123 18:32:12.355139 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:12.355703 kubelet[2852]: E0123 18:32:12.355601 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p4nkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod calico-apiserver-548cd799bc-4tqh9_calico-apiserver(7830dd1a-252c-46ff-ba84-ba7feca691a8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:12.357285 kubelet[2852]: E0123 18:32:12.357223 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" podUID="7830dd1a-252c-46ff-ba84-ba7feca691a8" Jan 23 18:32:13.916517 containerd[1684]: time="2026-01-23T18:32:13.916402168Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 18:32:14.363750 containerd[1684]: time="2026-01-23T18:32:14.363650281Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:14.365176 containerd[1684]: time="2026-01-23T18:32:14.365088603Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 18:32:14.365281 containerd[1684]: time="2026-01-23T18:32:14.365223563Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:14.365606 kubelet[2852]: E0123 18:32:14.365524 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:32:14.365606 kubelet[2852]: E0123 18:32:14.365600 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:32:14.366390 kubelet[2852]: E0123 18:32:14.365747 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sc26w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-6z82s_calico-system(f61eaf28-593a-461f-8945-a34eecb93534): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:14.368697 containerd[1684]: time="2026-01-23T18:32:14.368616120Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 18:32:14.825104 containerd[1684]: time="2026-01-23T18:32:14.824913900Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:14.827594 containerd[1684]: time="2026-01-23T18:32:14.827413994Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 18:32:14.827594 containerd[1684]: time="2026-01-23T18:32:14.827515214Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:14.827923 kubelet[2852]: E0123 18:32:14.827820 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 18:32:14.828193 kubelet[2852]: E0123 18:32:14.827928 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 18:32:14.828746 kubelet[2852]: E0123 18:32:14.828630 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sc26w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-6z82s_calico-system(f61eaf28-593a-461f-8945-a34eecb93534): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:14.830468 kubelet[2852]: E0123 18:32:14.830384 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:32:16.912752 containerd[1684]: time="2026-01-23T18:32:16.912649061Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 18:32:17.357816 containerd[1684]: time="2026-01-23T18:32:17.357501071Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io 
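The containerd entries above show ghcr.io answering 404 Not Found for every ghcr.io/flatcar/calico/*:v3.30.4 pull, so the references simply do not resolve at the registry. A minimal sketch of how one might confirm that from outside the node, assuming GHCR's standard anonymous-pull token flow for public images (the endpoints, Accept types, and helper name below are illustrative assumptions, not taken from this log):

```python
# Illustrative check for whether a tag exists on ghcr.io, assuming the standard
# Docker Registry v2 / OCI Distribution token flow for public images.
import json
import urllib.error
import urllib.request

def ghcr_tag_exists(repository: str, tag: str) -> bool:
    # Request an anonymous bearer token scoped to pulling this repository.
    token_url = ("https://ghcr.io/token?service=ghcr.io"
                 f"&scope=repository:{repository}:pull")
    with urllib.request.urlopen(token_url) as resp:
        token = json.load(resp)["token"]

    # HEAD the manifest: 200 means the tag resolves, 404 mirrors the log above.
    req = urllib.request.Request(
        f"https://ghcr.io/v2/{repository}/manifests/{tag}", method="HEAD")
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Accept",
                   "application/vnd.oci.image.index.v1+json, "
                   "application/vnd.docker.distribution.manifest.list.v2+json, "
                   "application/vnd.docker.distribution.manifest.v2+json")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False
        raise

if __name__ == "__main__":
    print(ghcr_tag_exists("flatcar/calico/kube-controllers", "v3.30.4"))
```

If such a check returns False for a tag the cluster expects, the likely remedies are pinning the workloads to a tag that actually exists in that repository or correcting the registry/repository portion of the image references.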
Jan 23 18:32:17.359714 containerd[1684]: time="2026-01-23T18:32:17.359403824Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 18:32:17.360267 containerd[1684]: time="2026-01-23T18:32:17.359586334Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:17.360862 kubelet[2852]: E0123 18:32:17.360800 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:32:17.361467 kubelet[2852]: E0123 18:32:17.360875 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:32:17.361467 kubelet[2852]: E0123 18:32:17.361057 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwzkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-69d9c854f8-4vpsg_calico-system(4e218aa3-82e2-43da-97a5-0f48de07a97f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:17.362655 kubelet[2852]: E0123 18:32:17.362266 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" podUID="4e218aa3-82e2-43da-97a5-0f48de07a97f" Jan 23 18:32:17.918177 kubelet[2852]: E0123 18:32:17.917818 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7b5b4545c8-4tbgk" podUID="e698d847-3c19-47a7-986c-5552a2964f3f" Jan 23 18:32:22.912221 kubelet[2852]: E0123 18:32:22.912128 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" podUID="8bfceda8-a381-44e3-ab5c-8d6cd2f190e0" Jan 23 18:32:23.915189 kubelet[2852]: E0123 18:32:23.914483 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack 
image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-dw7bl" podUID="8cfde420-1102-4e10-b36c-f5766b852cc7" Jan 23 18:32:24.912823 kubelet[2852]: E0123 18:32:24.912686 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" podUID="7830dd1a-252c-46ff-ba84-ba7feca691a8" Jan 23 18:32:28.915044 kubelet[2852]: E0123 18:32:28.914886 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" podUID="4e218aa3-82e2-43da-97a5-0f48de07a97f" Jan 23 18:32:29.914661 kubelet[2852]: E0123 18:32:29.914619 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:32:31.915949 containerd[1684]: time="2026-01-23T18:32:31.915576240Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 18:32:32.367143 containerd[1684]: time="2026-01-23T18:32:32.367095161Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:32.368562 containerd[1684]: time="2026-01-23T18:32:32.368542546Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 18:32:32.368706 containerd[1684]: time="2026-01-23T18:32:32.368624904Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:32.368914 kubelet[2852]: E0123 18:32:32.368884 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:32:32.369946 kubelet[2852]: E0123 18:32:32.369228 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:32:32.369946 kubelet[2852]: E0123 18:32:32.369316 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8ad8e04c94d744beb4c48881ae67d8bd,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lzrrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7b5b4545c8-4tbgk_calico-system(e698d847-3c19-47a7-986c-5552a2964f3f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:32.372241 containerd[1684]: time="2026-01-23T18:32:32.372199725Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 18:32:32.805034 containerd[1684]: time="2026-01-23T18:32:32.804895462Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:32.806561 containerd[1684]: time="2026-01-23T18:32:32.806474926Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 18:32:32.806561 containerd[1684]: time="2026-01-23T18:32:32.806557664Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:32.806778 kubelet[2852]: E0123 18:32:32.806699 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" 
Jan 23 18:32:32.806778 kubelet[2852]: E0123 18:32:32.806761 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 18:32:32.806920 kubelet[2852]: E0123 18:32:32.806874 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzrrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7b5b4545c8-4tbgk_calico-system(e698d847-3c19-47a7-986c-5552a2964f3f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:32.808482 kubelet[2852]: E0123 18:32:32.808293 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7b5b4545c8-4tbgk" podUID="e698d847-3c19-47a7-986c-5552a2964f3f" Jan 23 18:32:34.915264 containerd[1684]: time="2026-01-23T18:32:34.915147191Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 
18:32:35.356577 containerd[1684]: time="2026-01-23T18:32:35.356527906Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:35.358293 containerd[1684]: time="2026-01-23T18:32:35.358260989Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 18:32:35.358370 containerd[1684]: time="2026-01-23T18:32:35.358332048Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:35.358690 kubelet[2852]: E0123 18:32:35.358652 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:32:35.358965 kubelet[2852]: E0123 18:32:35.358699 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:32:35.358965 kubelet[2852]: E0123 18:32:35.358859 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x97s9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-dw7bl_calico-system(8cfde420-1102-4e10-b36c-f5766b852cc7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:35.359947 containerd[1684]: time="2026-01-23T18:32:35.359923383Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:32:35.360757 kubelet[2852]: E0123 18:32:35.360624 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-dw7bl" podUID="8cfde420-1102-4e10-b36c-f5766b852cc7" Jan 23 18:32:35.819575 containerd[1684]: time="2026-01-23T18:32:35.819428657Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:35.820648 containerd[1684]: time="2026-01-23T18:32:35.820629008Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:32:35.820745 containerd[1684]: time="2026-01-23T18:32:35.820673387Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:35.821004 kubelet[2852]: E0123 18:32:35.820981 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:35.821542 kubelet[2852]: E0123 18:32:35.821476 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:35.821677 kubelet[2852]: E0123 18:32:35.821652 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2whq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-548cd799bc-rm88n_calico-apiserver(8bfceda8-a381-44e3-ab5c-8d6cd2f190e0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:35.822985 kubelet[2852]: E0123 18:32:35.822887 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" podUID="8bfceda8-a381-44e3-ab5c-8d6cd2f190e0" Jan 23 18:32:38.914021 containerd[1684]: time="2026-01-23T18:32:38.913567166Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:32:39.361106 containerd[1684]: time="2026-01-23T18:32:39.361051788Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:39.362628 containerd[1684]: time="2026-01-23T18:32:39.362544807Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:32:39.364044 containerd[1684]: time="2026-01-23T18:32:39.364017117Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:39.364376 
kubelet[2852]: E0123 18:32:39.364328 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:39.364723 kubelet[2852]: E0123 18:32:39.364391 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:32:39.364723 kubelet[2852]: E0123 18:32:39.364511 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p4nkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-548cd799bc-4tqh9_calico-apiserver(7830dd1a-252c-46ff-ba84-ba7feca691a8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:39.366026 kubelet[2852]: E0123 18:32:39.365996 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not 
found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" podUID="7830dd1a-252c-46ff-ba84-ba7feca691a8" Jan 23 18:32:39.915520 containerd[1684]: time="2026-01-23T18:32:39.915425741Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 18:32:40.373352 containerd[1684]: time="2026-01-23T18:32:40.373300052Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:40.376015 containerd[1684]: time="2026-01-23T18:32:40.374586325Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 18:32:40.376015 containerd[1684]: time="2026-01-23T18:32:40.374630735Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:40.376203 kubelet[2852]: E0123 18:32:40.374709 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:32:40.376203 kubelet[2852]: E0123 18:32:40.374759 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:32:40.376203 kubelet[2852]: E0123 18:32:40.374893 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwzkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-69d9c854f8-4vpsg_calico-system(4e218aa3-82e2-43da-97a5-0f48de07a97f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:40.377635 kubelet[2852]: E0123 18:32:40.377062 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" podUID="4e218aa3-82e2-43da-97a5-0f48de07a97f" Jan 23 18:32:43.919681 containerd[1684]: time="2026-01-23T18:32:43.919296278Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 18:32:44.346934 containerd[1684]: time="2026-01-23T18:32:44.346892103Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:44.348168 containerd[1684]: time="2026-01-23T18:32:44.348131329Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 18:32:44.348168 containerd[1684]: time="2026-01-23T18:32:44.348149208Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:44.348375 kubelet[2852]: E0123 18:32:44.348342 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:32:44.348651 kubelet[2852]: E0123 18:32:44.348387 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:32:44.349372 kubelet[2852]: E0123 18:32:44.349101 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sc26w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-6z82s_calico-system(f61eaf28-593a-461f-8945-a34eecb93534): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:44.351721 containerd[1684]: time="2026-01-23T18:32:44.351672756Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 18:32:44.798119 containerd[1684]: time="2026-01-23T18:32:44.797197593Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:32:44.800171 containerd[1684]: time="2026-01-23T18:32:44.800005789Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 18:32:44.800171 containerd[1684]: time="2026-01-23T18:32:44.800124937Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 18:32:44.800535 kubelet[2852]: E0123 18:32:44.800446 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 18:32:44.800717 kubelet[2852]: E0123 18:32:44.800693 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 18:32:44.800943 kubelet[2852]: E0123 18:32:44.800895 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sc26w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-6z82s_calico-system(f61eaf28-593a-461f-8945-a34eecb93534): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 18:32:44.802764 kubelet[2852]: E0123 18:32:44.802693 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:32:47.915230 kubelet[2852]: E0123 18:32:47.914532 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" podUID="8bfceda8-a381-44e3-ab5c-8d6cd2f190e0" Jan 23 18:32:47.916738 kubelet[2852]: E0123 18:32:47.916470 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7b5b4545c8-4tbgk" podUID="e698d847-3c19-47a7-986c-5552a2964f3f" Jan 23 18:32:50.915629 kubelet[2852]: E0123 18:32:50.914909 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-dw7bl" podUID="8cfde420-1102-4e10-b36c-f5766b852cc7" Jan 23 18:32:51.917995 kubelet[2852]: E0123 18:32:51.916855 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" podUID="4e218aa3-82e2-43da-97a5-0f48de07a97f" Jan 23 18:32:52.406000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-46.62.169.9:22-4.153.228.146:38906 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:32:52.408956 kernel: kauditd_printk_skb: 83 callbacks suppressed Jan 23 18:32:52.409144 kernel: audit: type=1130 audit(1769193172.406:757): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-46.62.169.9:22-4.153.228.146:38906 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:32:52.407616 systemd[1]: Started sshd@11-46.62.169.9:22-4.153.228.146:38906.service - OpenSSH per-connection server daemon (4.153.228.146:38906). 
Jan 23 18:32:52.913011 kubelet[2852]: E0123 18:32:52.912835 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" podUID="7830dd1a-252c-46ff-ba84-ba7feca691a8" Jan 23 18:32:53.108707 sshd[4972]: Accepted publickey for core from 4.153.228.146 port 38906 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:32:53.107000 audit[4972]: USER_ACCT pid=4972 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:53.112833 sshd-session[4972]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:32:53.115023 kernel: audit: type=1101 audit(1769193173.107:758): pid=4972 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:53.110000 audit[4972]: CRED_ACQ pid=4972 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:53.122021 kernel: audit: type=1103 audit(1769193173.110:759): pid=4972 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:53.126023 systemd-logind[1647]: New session 9 of user core. Jan 23 18:32:53.128432 kernel: audit: type=1006 audit(1769193173.110:760): pid=4972 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Jan 23 18:32:53.128113 systemd[1]: Started session-9.scope - Session 9 of User core. 
Jan 23 18:32:53.110000 audit[4972]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffddc9a7800 a2=3 a3=0 items=0 ppid=1 pid=4972 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:32:53.136015 kernel: audit: type=1300 audit(1769193173.110:760): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffddc9a7800 a2=3 a3=0 items=0 ppid=1 pid=4972 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:32:53.110000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:32:53.140994 kernel: audit: type=1327 audit(1769193173.110:760): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:32:53.136000 audit[4972]: USER_START pid=4972 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:53.149994 kernel: audit: type=1105 audit(1769193173.136:761): pid=4972 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:53.139000 audit[4994]: CRED_ACQ pid=4994 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:53.157995 kernel: audit: type=1103 audit(1769193173.139:762): pid=4994 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:53.580697 sshd[4994]: Connection closed by 4.153.228.146 port 38906 Jan 23 18:32:53.581670 sshd-session[4972]: pam_unix(sshd:session): session closed for user core Jan 23 18:32:53.584000 audit[4972]: USER_END pid=4972 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:53.595053 systemd[1]: sshd@11-46.62.169.9:22-4.153.228.146:38906.service: Deactivated successfully. 
Jan 23 18:32:53.601122 kernel: audit: type=1106 audit(1769193173.584:763): pid=4972 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:53.602044 kernel: audit: type=1104 audit(1769193173.584:764): pid=4972 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:53.584000 audit[4972]: CRED_DISP pid=4972 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:53.603798 systemd[1]: session-9.scope: Deactivated successfully. Jan 23 18:32:53.606922 systemd-logind[1647]: Session 9 logged out. Waiting for processes to exit. Jan 23 18:32:53.616217 systemd-logind[1647]: Removed session 9. Jan 23 18:32:53.593000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-46.62.169.9:22-4.153.228.146:38906 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:32:55.918029 kubelet[2852]: E0123 18:32:55.917929 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:32:58.720267 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 18:32:58.720401 kernel: audit: type=1130 audit(1769193178.717:766): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-46.62.169.9:22-4.153.228.146:35218 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:32:58.717000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-46.62.169.9:22-4.153.228.146:35218 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:32:58.718340 systemd[1]: Started sshd@12-46.62.169.9:22-4.153.228.146:35218.service - OpenSSH per-connection server daemon (4.153.228.146:35218). 
Jan 23 18:32:59.398091 sshd[5012]: Accepted publickey for core from 4.153.228.146 port 35218 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:32:59.396000 audit[5012]: USER_ACCT pid=5012 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:59.406445 kernel: audit: type=1101 audit(1769193179.396:767): pid=5012 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:59.407658 sshd-session[5012]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:32:59.405000 audit[5012]: CRED_ACQ pid=5012 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:59.417083 kernel: audit: type=1103 audit(1769193179.405:768): pid=5012 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:59.421555 systemd-logind[1647]: New session 10 of user core. Jan 23 18:32:59.429009 kernel: audit: type=1006 audit(1769193179.405:769): pid=5012 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Jan 23 18:32:59.429241 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 23 18:32:59.405000 audit[5012]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffd8b04110 a2=3 a3=0 items=0 ppid=1 pid=5012 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:32:59.439110 kernel: audit: type=1300 audit(1769193179.405:769): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fffd8b04110 a2=3 a3=0 items=0 ppid=1 pid=5012 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:32:59.405000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:32:59.448004 kernel: audit: type=1327 audit(1769193179.405:769): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:32:59.439000 audit[5012]: USER_START pid=5012 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:59.458006 kernel: audit: type=1105 audit(1769193179.439:770): pid=5012 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:59.441000 audit[5016]: CRED_ACQ pid=5016 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:59.464993 kernel: audit: type=1103 audit(1769193179.441:771): pid=5016 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:59.869914 sshd[5016]: Connection closed by 4.153.228.146 port 35218 Jan 23 18:32:59.873105 sshd-session[5012]: pam_unix(sshd:session): session closed for user core Jan 23 18:32:59.876000 audit[5012]: USER_END pid=5012 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:59.884607 systemd[1]: sshd@12-46.62.169.9:22-4.153.228.146:35218.service: Deactivated successfully. Jan 23 18:32:59.886918 systemd[1]: session-10.scope: Deactivated successfully. Jan 23 18:32:59.890313 systemd-logind[1647]: Session 10 logged out. Waiting for processes to exit. Jan 23 18:32:59.891247 systemd-logind[1647]: Removed session 10. 
Jan 23 18:32:59.876000 audit[5012]: CRED_DISP pid=5012 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:59.898001 kernel: audit: type=1106 audit(1769193179.876:772): pid=5012 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:59.898152 kernel: audit: type=1104 audit(1769193179.876:773): pid=5012 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:32:59.881000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-46.62.169.9:22-4.153.228.146:35218 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:00.914306 kubelet[2852]: E0123 18:33:00.914176 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7b5b4545c8-4tbgk" podUID="e698d847-3c19-47a7-986c-5552a2964f3f" Jan 23 18:33:00.915657 kubelet[2852]: E0123 18:33:00.914654 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" podUID="8bfceda8-a381-44e3-ab5c-8d6cd2f190e0" Jan 23 18:33:04.912603 kubelet[2852]: E0123 18:33:04.912555 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-dw7bl" podUID="8cfde420-1102-4e10-b36c-f5766b852cc7" Jan 23 18:33:05.007000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-46.62.169.9:22-4.153.228.146:60562 comm="systemd" 
exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:05.008666 systemd[1]: Started sshd@13-46.62.169.9:22-4.153.228.146:60562.service - OpenSSH per-connection server daemon (4.153.228.146:60562). Jan 23 18:33:05.010641 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 18:33:05.010703 kernel: audit: type=1130 audit(1769193185.007:775): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-46.62.169.9:22-4.153.228.146:60562 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:05.687000 audit[5029]: USER_ACCT pid=5029 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:05.703209 kernel: audit: type=1101 audit(1769193185.687:776): pid=5029 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:05.706110 sshd[5029]: Accepted publickey for core from 4.153.228.146 port 60562 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:33:05.704000 audit[5029]: CRED_ACQ pid=5029 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:05.707297 sshd-session[5029]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:05.720459 kernel: audit: type=1103 audit(1769193185.704:777): pid=5029 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:05.704000 audit[5029]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe471fcce0 a2=3 a3=0 items=0 ppid=1 pid=5029 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:05.733316 kernel: audit: type=1006 audit(1769193185.704:778): pid=5029 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 23 18:33:05.733386 kernel: audit: type=1300 audit(1769193185.704:778): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe471fcce0 a2=3 a3=0 items=0 ppid=1 pid=5029 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:05.704000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:05.739603 kernel: audit: type=1327 audit(1769193185.704:778): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:05.742160 systemd-logind[1647]: New session 11 of user core. Jan 23 18:33:05.747154 systemd[1]: Started session-11.scope - Session 11 of User core. 
Jan 23 18:33:05.750000 audit[5029]: USER_START pid=5029 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:05.750000 audit[5033]: CRED_ACQ pid=5033 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:05.761024 kernel: audit: type=1105 audit(1769193185.750:779): pid=5029 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:05.761105 kernel: audit: type=1103 audit(1769193185.750:780): pid=5033 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:05.912915 kubelet[2852]: E0123 18:33:05.912376 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" podUID="4e218aa3-82e2-43da-97a5-0f48de07a97f" Jan 23 18:33:06.118305 sshd[5033]: Connection closed by 4.153.228.146 port 60562 Jan 23 18:33:06.118822 sshd-session[5029]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:06.119000 audit[5029]: USER_END pid=5029 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:06.124480 systemd[1]: sshd@13-46.62.169.9:22-4.153.228.146:60562.service: Deactivated successfully. Jan 23 18:33:06.129088 systemd[1]: session-11.scope: Deactivated successfully. Jan 23 18:33:06.132186 systemd-logind[1647]: Session 11 logged out. Waiting for processes to exit. Jan 23 18:33:06.133707 systemd-logind[1647]: Removed session 11. 
Jan 23 18:33:06.135001 kernel: audit: type=1106 audit(1769193186.119:781): pid=5029 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:06.119000 audit[5029]: CRED_DISP pid=5029 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:06.125000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-46.62.169.9:22-4.153.228.146:60562 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:06.146996 kernel: audit: type=1104 audit(1769193186.119:782): pid=5029 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:06.251000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-46.62.169.9:22-4.153.228.146:60568 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:06.252189 systemd[1]: Started sshd@14-46.62.169.9:22-4.153.228.146:60568.service - OpenSSH per-connection server daemon (4.153.228.146:60568). Jan 23 18:33:06.906000 audit[5046]: USER_ACCT pid=5046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:06.907910 sshd[5046]: Accepted publickey for core from 4.153.228.146 port 60568 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:33:06.908000 audit[5046]: CRED_ACQ pid=5046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:06.908000 audit[5046]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe3f123580 a2=3 a3=0 items=0 ppid=1 pid=5046 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:06.908000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:06.917051 kubelet[2852]: E0123 18:33:06.915276 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" podUID="7830dd1a-252c-46ff-ba84-ba7feca691a8" Jan 23 18:33:06.918823 sshd-session[5046]: pam_unix(sshd:session): session opened for user 
core(uid=500) by core(uid=0) Jan 23 18:33:06.936443 systemd-logind[1647]: New session 12 of user core. Jan 23 18:33:06.942419 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 23 18:33:06.956000 audit[5046]: USER_START pid=5046 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:06.962000 audit[5050]: CRED_ACQ pid=5050 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:07.413120 sshd[5050]: Connection closed by 4.153.228.146 port 60568 Jan 23 18:33:07.413454 sshd-session[5046]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:07.416000 audit[5046]: USER_END pid=5046 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:07.416000 audit[5046]: CRED_DISP pid=5046 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:07.420000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-46.62.169.9:22-4.153.228.146:60568 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:07.421669 systemd[1]: sshd@14-46.62.169.9:22-4.153.228.146:60568.service: Deactivated successfully. Jan 23 18:33:07.424595 systemd[1]: session-12.scope: Deactivated successfully. Jan 23 18:33:07.427734 systemd-logind[1647]: Session 12 logged out. Waiting for processes to exit. Jan 23 18:33:07.430747 systemd-logind[1647]: Removed session 12. Jan 23 18:33:07.555000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-46.62.169.9:22-4.153.228.146:60578 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:07.556364 systemd[1]: Started sshd@15-46.62.169.9:22-4.153.228.146:60578.service - OpenSSH per-connection server daemon (4.153.228.146:60578). 
Jan 23 18:33:08.245000 audit[5060]: USER_ACCT pid=5060 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:08.247039 sshd[5060]: Accepted publickey for core from 4.153.228.146 port 60578 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:33:08.248000 audit[5060]: CRED_ACQ pid=5060 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:08.248000 audit[5060]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc129c7fd0 a2=3 a3=0 items=0 ppid=1 pid=5060 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:08.248000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:08.251255 sshd-session[5060]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:08.262084 systemd-logind[1647]: New session 13 of user core. Jan 23 18:33:08.269262 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 23 18:33:08.274000 audit[5060]: USER_START pid=5060 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:08.278000 audit[5068]: CRED_ACQ pid=5068 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:08.713837 sshd[5068]: Connection closed by 4.153.228.146 port 60578 Jan 23 18:33:08.715470 sshd-session[5060]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:08.721000 audit[5060]: USER_END pid=5060 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:08.721000 audit[5060]: CRED_DISP pid=5060 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:08.728644 systemd-logind[1647]: Session 13 logged out. Waiting for processes to exit. Jan 23 18:33:08.729246 systemd[1]: sshd@15-46.62.169.9:22-4.153.228.146:60578.service: Deactivated successfully. Jan 23 18:33:08.729000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-46.62.169.9:22-4.153.228.146:60578 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:08.735413 systemd[1]: session-13.scope: Deactivated successfully. 
Jan 23 18:33:08.743668 systemd-logind[1647]: Removed session 13. Jan 23 18:33:10.911577 kubelet[2852]: E0123 18:33:10.911510 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:33:13.851521 systemd[1]: Started sshd@16-46.62.169.9:22-4.153.228.146:60592.service - OpenSSH per-connection server daemon (4.153.228.146:60592). Jan 23 18:33:13.858808 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 23 18:33:13.858896 kernel: audit: type=1130 audit(1769193193.850:802): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-46.62.169.9:22-4.153.228.146:60592 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:13.850000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-46.62.169.9:22-4.153.228.146:60592 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:33:13.912543 kubelet[2852]: E0123 18:33:13.912498 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" podUID="8bfceda8-a381-44e3-ab5c-8d6cd2f190e0" Jan 23 18:33:14.523000 audit[5081]: USER_ACCT pid=5081 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:14.526269 sshd[5081]: Accepted publickey for core from 4.153.228.146 port 60592 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:33:14.531512 sshd-session[5081]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:14.540098 kernel: audit: type=1101 audit(1769193194.523:803): pid=5081 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:14.527000 audit[5081]: CRED_ACQ pid=5081 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:14.552864 systemd-logind[1647]: New session 14 of user core. Jan 23 18:33:14.559173 kernel: audit: type=1103 audit(1769193194.527:804): pid=5081 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:14.562333 systemd[1]: Started session-14.scope - Session 14 of User core. 
Jan 23 18:33:14.578649 kernel: audit: type=1006 audit(1769193194.527:805): pid=5081 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Jan 23 18:33:14.578751 kernel: audit: type=1300 audit(1769193194.527:805): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffee6807d10 a2=3 a3=0 items=0 ppid=1 pid=5081 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:14.527000 audit[5081]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffee6807d10 a2=3 a3=0 items=0 ppid=1 pid=5081 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:14.527000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:14.581000 audit[5081]: USER_START pid=5081 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:14.587351 kernel: audit: type=1327 audit(1769193194.527:805): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:14.587419 kernel: audit: type=1105 audit(1769193194.581:806): pid=5081 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:14.591000 audit[5091]: CRED_ACQ pid=5091 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:14.593945 kernel: audit: type=1103 audit(1769193194.591:807): pid=5091 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:14.912935 containerd[1684]: time="2026-01-23T18:33:14.912857007Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 18:33:15.011902 sshd[5091]: Connection closed by 4.153.228.146 port 60592 Jan 23 18:33:15.012823 sshd-session[5081]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:15.016000 audit[5081]: USER_END pid=5081 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:15.024161 systemd-logind[1647]: Session 14 logged out. Waiting for processes to exit. Jan 23 18:33:15.024691 systemd[1]: sshd@16-46.62.169.9:22-4.153.228.146:60592.service: Deactivated successfully. Jan 23 18:33:15.030339 systemd[1]: session-14.scope: Deactivated successfully. 
Jan 23 18:33:15.034079 kernel: audit: type=1106 audit(1769193195.016:808): pid=5081 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:15.039222 systemd-logind[1647]: Removed session 14. Jan 23 18:33:15.017000 audit[5081]: CRED_DISP pid=5081 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:15.053210 kernel: audit: type=1104 audit(1769193195.017:809): pid=5081 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:15.024000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-46.62.169.9:22-4.153.228.146:60592 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:15.149000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-46.62.169.9:22-4.153.228.146:49372 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:15.150253 systemd[1]: Started sshd@17-46.62.169.9:22-4.153.228.146:49372.service - OpenSSH per-connection server daemon (4.153.228.146:49372). Jan 23 18:33:15.383217 containerd[1684]: time="2026-01-23T18:33:15.383141630Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:33:15.385351 containerd[1684]: time="2026-01-23T18:33:15.385171718Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 18:33:15.385775 containerd[1684]: time="2026-01-23T18:33:15.385228438Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 18:33:15.386032 kubelet[2852]: E0123 18:33:15.385978 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:33:15.386032 kubelet[2852]: E0123 18:33:15.386027 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 18:33:15.387663 kubelet[2852]: E0123 18:33:15.387209 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8ad8e04c94d744beb4c48881ae67d8bd,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lzrrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7b5b4545c8-4tbgk_calico-system(e698d847-3c19-47a7-986c-5552a2964f3f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 18:33:15.389507 containerd[1684]: time="2026-01-23T18:33:15.389346955Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 18:33:15.817000 audit[5102]: USER_ACCT pid=5102 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:15.818779 sshd[5102]: Accepted publickey for core from 4.153.228.146 port 49372 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:33:15.818000 audit[5102]: CRED_ACQ pid=5102 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:15.818000 audit[5102]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffcbfaead70 a2=3 a3=0 items=0 ppid=1 pid=5102 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:15.818000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:15.820758 containerd[1684]: time="2026-01-23T18:33:15.820626468Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:33:15.820920 sshd-session[5102]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:15.822253 containerd[1684]: time="2026-01-23T18:33:15.822187859Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" 
failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 18:33:15.823220 containerd[1684]: time="2026-01-23T18:33:15.823122534Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 18:33:15.823504 kubelet[2852]: E0123 18:33:15.823393 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 18:33:15.823504 kubelet[2852]: E0123 18:33:15.823434 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 18:33:15.824533 kubelet[2852]: E0123 18:33:15.824505 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lzrrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7b5b4545c8-4tbgk_calico-system(e698d847-3c19-47a7-986c-5552a2964f3f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 18:33:15.826287 kubelet[2852]: E0123 18:33:15.826143 2852 pod_workers.go:1301] "Error syncing pod, skipping" 
err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7b5b4545c8-4tbgk" podUID="e698d847-3c19-47a7-986c-5552a2964f3f" Jan 23 18:33:15.828686 systemd-logind[1647]: New session 15 of user core. Jan 23 18:33:15.834114 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 23 18:33:15.835000 audit[5102]: USER_START pid=5102 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:15.837000 audit[5106]: CRED_ACQ pid=5106 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:16.538560 sshd[5106]: Connection closed by 4.153.228.146 port 49372 Jan 23 18:33:16.536874 sshd-session[5102]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:16.540000 audit[5102]: USER_END pid=5102 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:16.541000 audit[5102]: CRED_DISP pid=5102 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:16.551752 systemd[1]: sshd@17-46.62.169.9:22-4.153.228.146:49372.service: Deactivated successfully. Jan 23 18:33:16.551000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-46.62.169.9:22-4.153.228.146:49372 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:16.556712 systemd[1]: session-15.scope: Deactivated successfully. Jan 23 18:33:16.561238 systemd-logind[1647]: Session 15 logged out. Waiting for processes to exit. Jan 23 18:33:16.563681 systemd-logind[1647]: Removed session 15. Jan 23 18:33:16.675559 systemd[1]: Started sshd@18-46.62.169.9:22-4.153.228.146:49388.service - OpenSSH per-connection server daemon (4.153.228.146:49388). Jan 23 18:33:16.675000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-46.62.169.9:22-4.153.228.146:49388 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:33:17.358000 audit[5116]: USER_ACCT pid=5116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:17.359640 sshd[5116]: Accepted publickey for core from 4.153.228.146 port 49388 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:33:17.360000 audit[5116]: CRED_ACQ pid=5116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:17.360000 audit[5116]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe171c5be0 a2=3 a3=0 items=0 ppid=1 pid=5116 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:17.360000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:17.362930 sshd-session[5116]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:17.368717 systemd-logind[1647]: New session 16 of user core. Jan 23 18:33:17.374287 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 23 18:33:17.377000 audit[5116]: USER_START pid=5116 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:17.378000 audit[5120]: CRED_ACQ pid=5120 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:18.497000 audit[5132]: NETFILTER_CFG table=filter:142 family=2 entries=26 op=nft_register_rule pid=5132 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:33:18.497000 audit[5132]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffc09e8bbc0 a2=0 a3=7ffc09e8bbac items=0 ppid=2960 pid=5132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:18.497000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:33:18.506000 audit[5132]: NETFILTER_CFG table=nat:143 family=2 entries=20 op=nft_register_rule pid=5132 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:33:18.506000 audit[5132]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffc09e8bbc0 a2=0 a3=0 items=0 ppid=2960 pid=5132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:18.506000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:33:18.529000 audit[5134]: 
NETFILTER_CFG table=filter:144 family=2 entries=38 op=nft_register_rule pid=5134 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:33:18.529000 audit[5134]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7fffe6ceb570 a2=0 a3=7fffe6ceb55c items=0 ppid=2960 pid=5134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:18.529000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:33:18.534000 audit[5134]: NETFILTER_CFG table=nat:145 family=2 entries=20 op=nft_register_rule pid=5134 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:33:18.534000 audit[5134]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7fffe6ceb570 a2=0 a3=0 items=0 ppid=2960 pid=5134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:18.534000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:33:18.610627 sshd[5120]: Connection closed by 4.153.228.146 port 49388 Jan 23 18:33:18.612189 sshd-session[5116]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:18.616000 audit[5116]: USER_END pid=5116 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:18.616000 audit[5116]: CRED_DISP pid=5116 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:18.622423 systemd[1]: sshd@18-46.62.169.9:22-4.153.228.146:49388.service: Deactivated successfully. Jan 23 18:33:18.622000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-46.62.169.9:22-4.153.228.146:49388 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:18.628526 systemd[1]: session-16.scope: Deactivated successfully. Jan 23 18:33:18.633225 systemd-logind[1647]: Session 16 logged out. Waiting for processes to exit. Jan 23 18:33:18.637631 systemd-logind[1647]: Removed session 16. Jan 23 18:33:18.743603 systemd[1]: Started sshd@19-46.62.169.9:22-4.153.228.146:49398.service - OpenSSH per-connection server daemon (4.153.228.146:49398). Jan 23 18:33:18.743000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-46.62.169.9:22-4.153.228.146:49398 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:33:18.912308 containerd[1684]: time="2026-01-23T18:33:18.912153963Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 18:33:19.361849 containerd[1684]: time="2026-01-23T18:33:19.361774679Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:33:19.363642 containerd[1684]: time="2026-01-23T18:33:19.363591049Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 18:33:19.363836 containerd[1684]: time="2026-01-23T18:33:19.363679739Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 18:33:19.364139 kubelet[2852]: E0123 18:33:19.363926 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:33:19.364139 kubelet[2852]: E0123 18:33:19.364118 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 18:33:19.366303 kubelet[2852]: E0123 18:33:19.364309 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x97s9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-dw7bl_calico-system(8cfde420-1102-4e10-b36c-f5766b852cc7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 18:33:19.366303 kubelet[2852]: E0123 18:33:19.365949 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-dw7bl" podUID="8cfde420-1102-4e10-b36c-f5766b852cc7" Jan 23 18:33:19.398281 kernel: kauditd_printk_skb: 36 callbacks suppressed Jan 23 18:33:19.398439 kernel: audit: type=1101 audit(1769193199.391:834): pid=5139 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:19.391000 audit[5139]: USER_ACCT pid=5139 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:19.397790 sshd-session[5139]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:19.399139 sshd[5139]: Accepted publickey for core from 4.153.228.146 port 49398 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:33:19.413057 kernel: audit: type=1103 audit(1769193199.393:835): pid=5139 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:19.393000 audit[5139]: CRED_ACQ pid=5139 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:19.436065 kernel: audit: type=1006 audit(1769193199.394:836): pid=5139 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Jan 23 18:33:19.434956 systemd-logind[1647]: New session 17 of user core. 
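The ErrImagePull records above (goldmane here, with apiserver, kube-controllers and csi failing the same way further on) all share one cause: containerd receives a 404 Not Found from ghcr.io while resolving the ghcr.io/flatcar/calico/*:v3.30.4 tags, so kubelet cannot start the containers and falls back to ImagePullBackOff. One way to confirm whether the tag itself is unresolvable, independently of kubelet's back-off timing, is to query the registry's OCI distribution API directly. This is a minimal sketch, assuming ghcr.io issues anonymous pull tokens for these repositories; ghcr_manifest_status is a hypothetical helper, not something present on this system:

```python
import json
import urllib.error
import urllib.request

def ghcr_manifest_status(repo: str, tag: str) -> int:
    """Return the HTTP status ghcr.io gives for a manifest HEAD request (sketch)."""
    # Anonymous pull token for the repository (assumed to be granted for public repos).
    token_url = f"https://ghcr.io/token?scope=repository:{repo}:pull"
    with urllib.request.urlopen(token_url) as resp:
        token = json.load(resp)["token"]
    req = urllib.request.Request(
        f"https://ghcr.io/v2/{repo}/manifests/{tag}",
        method="HEAD",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": ", ".join([
                "application/vnd.oci.image.index.v1+json",
                "application/vnd.docker.distribution.manifest.list.v2+json",
                "application/vnd.docker.distribution.manifest.v2+json",
            ]),
        },
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

for image in ("goldmane", "apiserver", "kube-controllers", "csi"):
    repo = f"flatcar/calico/{image}"
    print(repo, ghcr_manifest_status(repo, "v3.30.4"))
```

A 404 from this check matches the "failed to resolve image" errors kubelet reports below; a 200 would instead point at a node-local cause such as containerd registry or credential configuration.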
Jan 23 18:33:19.394000 audit[5139]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffee108da80 a2=3 a3=0 items=0 ppid=1 pid=5139 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:19.394000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:19.458031 kernel: audit: type=1300 audit(1769193199.394:836): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffee108da80 a2=3 a3=0 items=0 ppid=1 pid=5139 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:19.458158 kernel: audit: type=1327 audit(1769193199.394:836): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:19.453285 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 23 18:33:19.464000 audit[5139]: USER_START pid=5139 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:19.483192 kernel: audit: type=1105 audit(1769193199.464:837): pid=5139 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:19.481000 audit[5145]: CRED_ACQ pid=5145 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:19.497038 kernel: audit: type=1103 audit(1769193199.481:838): pid=5145 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:20.016186 sshd[5145]: Connection closed by 4.153.228.146 port 49398 Jan 23 18:33:20.020641 sshd-session[5139]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:20.046037 kernel: audit: type=1106 audit(1769193200.026:839): pid=5139 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:20.026000 audit[5139]: USER_END pid=5139 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:20.040435 systemd[1]: sshd@19-46.62.169.9:22-4.153.228.146:49398.service: Deactivated successfully. Jan 23 18:33:20.044642 systemd[1]: session-17.scope: Deactivated successfully. 
Jan 23 18:33:20.050022 systemd-logind[1647]: Session 17 logged out. Waiting for processes to exit. Jan 23 18:33:20.026000 audit[5139]: CRED_DISP pid=5139 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:20.055037 systemd-logind[1647]: Removed session 17. Jan 23 18:33:20.039000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-46.62.169.9:22-4.153.228.146:49398 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:20.067227 kernel: audit: type=1104 audit(1769193200.026:840): pid=5139 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:20.067282 kernel: audit: type=1131 audit(1769193200.039:841): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-46.62.169.9:22-4.153.228.146:49398 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:20.146335 systemd[1]: Started sshd@20-46.62.169.9:22-4.153.228.146:49408.service - OpenSSH per-connection server daemon (4.153.228.146:49408). Jan 23 18:33:20.145000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-46.62.169.9:22-4.153.228.146:49408 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:20.795000 audit[5155]: USER_ACCT pid=5155 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:20.796530 sshd[5155]: Accepted publickey for core from 4.153.228.146 port 49408 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:33:20.797000 audit[5155]: CRED_ACQ pid=5155 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:20.797000 audit[5155]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffea0019e50 a2=3 a3=0 items=0 ppid=1 pid=5155 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:20.797000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:20.800688 sshd-session[5155]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:20.811542 systemd-logind[1647]: New session 18 of user core. Jan 23 18:33:20.816293 systemd[1]: Started session-18.scope - Session 18 of User core. 
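The PROCTITLE values in the audit records are the command lines of the audited processes, hex-encoded with NUL bytes separating the arguments. Decoding the two values that recur throughout this section is straightforward; a small sketch:

```python
def decode_proctitle(hex_title: str) -> str:
    """Decode an audit PROCTITLE value: hex-encoded argv joined by NUL bytes."""
    return bytes.fromhex(hex_title).replace(b"\x00", b" ").decode()

# Values taken verbatim from the records above:
print(decode_proctitle("737368642D73657373696F6E3A20636F7265205B707269765D"))
# -> sshd-session: core [priv]
print(decode_proctitle(
    "69707461626C65732D726573746F7265002D770035002D5700313030303030"
    "002D2D6E6F666C757368002D2D636F756E74657273"
))
# -> iptables-restore -w 5 -W 100000 --noflush --counters
```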
Jan 23 18:33:20.825000 audit[5155]: USER_START pid=5155 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:20.830000 audit[5159]: CRED_ACQ pid=5159 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:20.914453 containerd[1684]: time="2026-01-23T18:33:20.914328062Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:33:21.290312 sshd[5159]: Connection closed by 4.153.228.146 port 49408 Jan 23 18:33:21.291211 sshd-session[5155]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:21.292000 audit[5155]: USER_END pid=5155 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:21.292000 audit[5155]: CRED_DISP pid=5155 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:21.299814 systemd[1]: sshd@20-46.62.169.9:22-4.153.228.146:49408.service: Deactivated successfully. Jan 23 18:33:21.299000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-46.62.169.9:22-4.153.228.146:49408 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:21.304322 systemd[1]: session-18.scope: Deactivated successfully. Jan 23 18:33:21.307603 systemd-logind[1647]: Session 18 logged out. Waiting for processes to exit. Jan 23 18:33:21.310300 systemd-logind[1647]: Removed session 18. 
Jan 23 18:33:21.349228 containerd[1684]: time="2026-01-23T18:33:21.349122525Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:33:21.350787 containerd[1684]: time="2026-01-23T18:33:21.350647798Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:33:21.350787 containerd[1684]: time="2026-01-23T18:33:21.350765317Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:33:21.351162 kubelet[2852]: E0123 18:33:21.350960 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:33:21.351162 kubelet[2852]: E0123 18:33:21.351057 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:33:21.353283 kubelet[2852]: E0123 18:33:21.351326 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p4nkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod calico-apiserver-548cd799bc-4tqh9_calico-apiserver(7830dd1a-252c-46ff-ba84-ba7feca691a8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:33:21.353283 kubelet[2852]: E0123 18:33:21.352936 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" podUID="7830dd1a-252c-46ff-ba84-ba7feca691a8" Jan 23 18:33:21.353617 containerd[1684]: time="2026-01-23T18:33:21.352242540Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 18:33:21.793725 containerd[1684]: time="2026-01-23T18:33:21.793620830Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:33:21.795416 containerd[1684]: time="2026-01-23T18:33:21.795362450Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 18:33:21.795514 containerd[1684]: time="2026-01-23T18:33:21.795471240Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 18:33:21.796111 kubelet[2852]: E0123 18:33:21.796013 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:33:21.796294 kubelet[2852]: E0123 18:33:21.796233 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 18:33:21.797648 kubelet[2852]: E0123 18:33:21.797524 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwzkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-69d9c854f8-4vpsg_calico-system(4e218aa3-82e2-43da-97a5-0f48de07a97f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 18:33:21.799095 kubelet[2852]: E0123 18:33:21.799014 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" podUID="4e218aa3-82e2-43da-97a5-0f48de07a97f" Jan 23 18:33:21.916063 kubelet[2852]: E0123 18:33:21.915854 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:33:23.241000 audit[5195]: NETFILTER_CFG table=filter:146 family=2 entries=26 op=nft_register_rule pid=5195 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:33:23.241000 audit[5195]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffcce34aa40 a2=0 a3=7ffcce34aa2c items=0 ppid=2960 pid=5195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:23.241000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:33:23.247000 audit[5195]: NETFILTER_CFG table=nat:147 family=2 entries=104 op=nft_register_chain pid=5195 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 18:33:23.247000 audit[5195]: SYSCALL arch=c000003e syscall=46 success=yes exit=48684 a0=3 a1=7ffcce34aa40 a2=0 a3=7ffcce34aa2c items=0 ppid=2960 pid=5195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:23.247000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 18:33:26.424220 systemd[1]: Started sshd@21-46.62.169.9:22-4.153.228.146:41904.service - OpenSSH per-connection server daemon (4.153.228.146:41904). Jan 23 18:33:26.431828 kernel: kauditd_printk_skb: 17 callbacks suppressed Jan 23 18:33:26.431875 kernel: audit: type=1130 audit(1769193206.423:853): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-46.62.169.9:22-4.153.228.146:41904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:26.423000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-46.62.169.9:22-4.153.228.146:41904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:33:26.911608 containerd[1684]: time="2026-01-23T18:33:26.911161354Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 18:33:27.076000 audit[5200]: USER_ACCT pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:27.078630 sshd[5200]: Accepted publickey for core from 4.153.228.146 port 41904 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:33:27.083740 sshd-session[5200]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:27.102053 kernel: audit: type=1101 audit(1769193207.076:854): pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:27.102182 kernel: audit: type=1103 audit(1769193207.079:855): pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:27.079000 audit[5200]: CRED_ACQ pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:27.097286 systemd-logind[1647]: New session 19 of user core. Jan 23 18:33:27.106539 kernel: audit: type=1006 audit(1769193207.080:856): pid=5200 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1 Jan 23 18:33:27.124027 kernel: audit: type=1300 audit(1769193207.080:856): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeb8d60280 a2=3 a3=0 items=0 ppid=1 pid=5200 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:27.080000 audit[5200]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeb8d60280 a2=3 a3=0 items=0 ppid=1 pid=5200 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:27.116225 systemd[1]: Started session-19.scope - Session 19 of User core. 
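Because kauditd periodically falls behind ("kauditd_printk_skb: N callbacks suppressed"), many events in this stretch appear twice: once as pre-parsed records such as USER_ACCT or CRED_ACQ, and once as raw kernel lines of the form audit: type=NNNN audit(EPOCH.millis:serial). The numeric types are the standard Linux audit record types, and the audit(...) prefix is an epoch-seconds timestamp plus a per-boot serial number. A small reference sketch covering only the types that occur in this section:

```python
from datetime import datetime, timezone

# Standard Linux audit record types seen in this section of the log.
AUDIT_TYPES = {
    1006: "LOGIN",
    1101: "USER_ACCT",
    1103: "CRED_ACQ",
    1104: "CRED_DISP",
    1105: "USER_START",
    1106: "USER_END",
    1130: "SERVICE_START",
    1131: "SERVICE_STOP",
    1300: "SYSCALL",
    1327: "PROCTITLE",
}

def decode_audit_stamp(stamp: str) -> tuple[datetime, int]:
    """Split 'audit(1769193207.076:854)' into a UTC timestamp and its serial number."""
    seconds, serial = stamp.strip("audit()").split(":")
    return datetime.fromtimestamp(float(seconds), tz=timezone.utc), int(serial)

ts, serial = decode_audit_stamp("audit(1769193207.076:854)")
print(AUDIT_TYPES[1101], ts.isoformat(timespec="seconds"), serial)
# -> USER_ACCT 2026-01-23T18:33:27+00:00 854
```

The decoded timestamp lines up with the syslog-style prefix on the matching USER_ACCT record above, which is how the duplicated kernel and userspace views of the same event can be correlated.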
Jan 23 18:33:27.129318 kernel: audit: type=1327 audit(1769193207.080:856): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:27.080000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:27.128000 audit[5200]: USER_START pid=5200 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:27.133000 audit[5204]: CRED_ACQ pid=5204 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:27.144567 kernel: audit: type=1105 audit(1769193207.128:857): pid=5200 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:27.144640 kernel: audit: type=1103 audit(1769193207.133:858): pid=5204 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:27.357642 containerd[1684]: time="2026-01-23T18:33:27.357402196Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:33:27.358951 containerd[1684]: time="2026-01-23T18:33:27.358865799Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 18:33:27.359198 containerd[1684]: time="2026-01-23T18:33:27.358959089Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 18:33:27.359882 kubelet[2852]: E0123 18:33:27.359792 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:33:27.359882 kubelet[2852]: E0123 18:33:27.359876 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 18:33:27.364012 kubelet[2852]: E0123 18:33:27.362110 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2whq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-548cd799bc-rm88n_calico-apiserver(8bfceda8-a381-44e3-ab5c-8d6cd2f190e0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 18:33:27.364012 kubelet[2852]: E0123 18:33:27.363706 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" podUID="8bfceda8-a381-44e3-ab5c-8d6cd2f190e0" Jan 23 18:33:27.563219 sshd[5204]: Connection closed by 4.153.228.146 port 41904 Jan 23 18:33:27.565156 sshd-session[5200]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:27.587441 kernel: audit: type=1106 audit(1769193207.568:859): pid=5200 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:27.568000 audit[5200]: USER_END pid=5200 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:27.575135 systemd[1]: sshd@21-46.62.169.9:22-4.153.228.146:41904.service: Deactivated successfully. Jan 23 18:33:27.575562 systemd-logind[1647]: Session 19 logged out. Waiting for processes to exit. Jan 23 18:33:27.580881 systemd[1]: session-19.scope: Deactivated successfully. Jan 23 18:33:27.588581 systemd-logind[1647]: Removed session 19. Jan 23 18:33:27.568000 audit[5200]: CRED_DISP pid=5200 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:27.573000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-46.62.169.9:22-4.153.228.146:41904 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:27.603129 kernel: audit: type=1104 audit(1769193207.568:860): pid=5200 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:27.912876 kubelet[2852]: E0123 18:33:27.912724 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7b5b4545c8-4tbgk" podUID="e698d847-3c19-47a7-986c-5552a2964f3f" Jan 23 18:33:31.913794 kubelet[2852]: E0123 18:33:31.913617 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-dw7bl" podUID="8cfde420-1102-4e10-b36c-f5766b852cc7" Jan 23 18:33:32.695846 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 18:33:32.695931 kernel: audit: type=1130 audit(1769193212.693:862): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-46.62.169.9:22-4.153.228.146:41906 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:32.693000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-46.62.169.9:22-4.153.228.146:41906 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:33:32.694399 systemd[1]: Started sshd@22-46.62.169.9:22-4.153.228.146:41906.service - OpenSSH per-connection server daemon (4.153.228.146:41906). Jan 23 18:33:32.913229 kubelet[2852]: E0123 18:33:32.913099 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" podUID="7830dd1a-252c-46ff-ba84-ba7feca691a8" Jan 23 18:33:32.915023 containerd[1684]: time="2026-01-23T18:33:32.914878416Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 18:33:33.343633 containerd[1684]: time="2026-01-23T18:33:33.343587202Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:33:33.344897 containerd[1684]: time="2026-01-23T18:33:33.344861897Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 18:33:33.345002 containerd[1684]: time="2026-01-23T18:33:33.344911076Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 18:33:33.345144 kubelet[2852]: E0123 18:33:33.345104 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:33:33.346247 kubelet[2852]: E0123 18:33:33.345154 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 18:33:33.346247 kubelet[2852]: E0123 18:33:33.345236 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sc26w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-6z82s_calico-system(f61eaf28-593a-461f-8945-a34eecb93534): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 18:33:33.347695 containerd[1684]: time="2026-01-23T18:33:33.347671246Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 18:33:33.375037 sshd[5237]: Accepted publickey for core from 4.153.228.146 port 41906 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:33:33.373000 audit[5237]: USER_ACCT pid=5237 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:33.379557 sshd-session[5237]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:33.382989 kernel: audit: type=1101 audit(1769193213.373:863): pid=5237 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:33.376000 audit[5237]: CRED_ACQ pid=5237 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:33.391355 kernel: audit: type=1103 audit(1769193213.376:864): pid=5237 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:33.393278 systemd-logind[1647]: New session 20 of user core. Jan 23 18:33:33.399000 kernel: audit: type=1006 audit(1769193213.376:865): pid=5237 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Jan 23 18:33:33.376000 audit[5237]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff9e8d5e00 a2=3 a3=0 items=0 ppid=1 pid=5237 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:33.404121 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 23 18:33:33.408543 kernel: audit: type=1300 audit(1769193213.376:865): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff9e8d5e00 a2=3 a3=0 items=0 ppid=1 pid=5237 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:33.376000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:33.414995 kernel: audit: type=1327 audit(1769193213.376:865): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:33.409000 audit[5237]: USER_START pid=5237 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:33.423991 kernel: audit: type=1105 audit(1769193213.409:866): pid=5237 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:33.413000 audit[5241]: CRED_ACQ pid=5241 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:33.434065 kernel: audit: type=1103 audit(1769193213.413:867): pid=5241 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:33.785945 containerd[1684]: time="2026-01-23T18:33:33.785627671Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 18:33:33.787631 containerd[1684]: time="2026-01-23T18:33:33.787492484Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 18:33:33.787631 containerd[1684]: time="2026-01-23T18:33:33.787553833Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes 
read=0" Jan 23 18:33:33.787967 kubelet[2852]: E0123 18:33:33.787867 2852 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 18:33:33.787967 kubelet[2852]: E0123 18:33:33.787937 2852 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 18:33:33.788266 kubelet[2852]: E0123 18:33:33.788195 2852 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sc26w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-6z82s_calico-system(f61eaf28-593a-461f-8945-a34eecb93534): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 18:33:33.789759 kubelet[2852]: E0123 18:33:33.789690 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for 
\"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:33:33.879183 sshd[5241]: Connection closed by 4.153.228.146 port 41906 Jan 23 18:33:33.880171 sshd-session[5237]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:33.881000 audit[5237]: USER_END pid=5237 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:33.888612 systemd[1]: sshd@22-46.62.169.9:22-4.153.228.146:41906.service: Deactivated successfully. Jan 23 18:33:33.893236 systemd[1]: session-20.scope: Deactivated successfully. Jan 23 18:33:33.898498 systemd-logind[1647]: Session 20 logged out. Waiting for processes to exit. Jan 23 18:33:33.899571 systemd-logind[1647]: Removed session 20. Jan 23 18:33:33.900083 kernel: audit: type=1106 audit(1769193213.881:868): pid=5237 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:33.882000 audit[5237]: CRED_DISP pid=5237 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:33.888000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-46.62.169.9:22-4.153.228.146:41906 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:33.916081 kernel: audit: type=1104 audit(1769193213.882:869): pid=5237 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:34.913503 kubelet[2852]: E0123 18:33:34.913433 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" podUID="4e218aa3-82e2-43da-97a5-0f48de07a97f" Jan 23 18:33:39.023346 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 18:33:39.023499 kernel: audit: type=1130 audit(1769193219.017:871): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-46.62.169.9:22-4.153.228.146:37132 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 18:33:39.017000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-46.62.169.9:22-4.153.228.146:37132 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:39.018251 systemd[1]: Started sshd@23-46.62.169.9:22-4.153.228.146:37132.service - OpenSSH per-connection server daemon (4.153.228.146:37132). Jan 23 18:33:39.714000 audit[5254]: USER_ACCT pid=5254 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:39.721619 kernel: audit: type=1101 audit(1769193219.714:872): pid=5254 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:39.721676 sshd[5254]: Accepted publickey for core from 4.153.228.146 port 37132 ssh2: RSA SHA256:FsLS6z7i21aNxkbTri9TFsF1k0iVr3y/E3bcCjvhLFU Jan 23 18:33:39.722848 sshd-session[5254]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 18:33:39.727815 systemd-logind[1647]: New session 21 of user core. Jan 23 18:33:39.720000 audit[5254]: CRED_ACQ pid=5254 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:39.736512 kernel: audit: type=1103 audit(1769193219.720:873): pid=5254 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:39.736558 kernel: audit: type=1006 audit(1769193219.720:874): pid=5254 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Jan 23 18:33:39.720000 audit[5254]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeca574a60 a2=3 a3=0 items=0 ppid=1 pid=5254 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:39.741679 kernel: audit: type=1300 audit(1769193219.720:874): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffeca574a60 a2=3 a3=0 items=0 ppid=1 pid=5254 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:39.742060 systemd[1]: Started session-21.scope - Session 21 of User core. 
Jan 23 18:33:39.720000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:39.747942 kernel: audit: type=1327 audit(1769193219.720:874): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 18:33:39.745000 audit[5254]: USER_START pid=5254 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:39.752306 kernel: audit: type=1105 audit(1769193219.745:875): pid=5254 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:39.749000 audit[5258]: CRED_ACQ pid=5258 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:39.759638 kernel: audit: type=1103 audit(1769193219.749:876): pid=5258 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:39.913881 kubelet[2852]: E0123 18:33:39.913766 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" podUID="8bfceda8-a381-44e3-ab5c-8d6cd2f190e0" Jan 23 18:33:40.208067 sshd[5258]: Connection closed by 4.153.228.146 port 37132 Jan 23 18:33:40.209316 sshd-session[5254]: pam_unix(sshd:session): session closed for user core Jan 23 18:33:40.213000 audit[5254]: USER_END pid=5254 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:40.233727 kernel: audit: type=1106 audit(1769193220.213:877): pid=5254 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:40.213000 audit[5254]: CRED_DISP pid=5254 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:40.238743 systemd[1]: sshd@23-46.62.169.9:22-4.153.228.146:37132.service: Deactivated successfully. 
Jan 23 18:33:40.243921 systemd[1]: session-21.scope: Deactivated successfully. Jan 23 18:33:40.248223 kernel: audit: type=1104 audit(1769193220.213:878): pid=5254 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 18:33:40.247088 systemd-logind[1647]: Session 21 logged out. Waiting for processes to exit. Jan 23 18:33:40.253058 systemd-logind[1647]: Removed session 21. Jan 23 18:33:40.239000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-46.62.169.9:22-4.153.228.146:37132 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 18:33:42.911583 kubelet[2852]: E0123 18:33:42.911400 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-dw7bl" podUID="8cfde420-1102-4e10-b36c-f5766b852cc7" Jan 23 18:33:42.912276 kubelet[2852]: E0123 18:33:42.911822 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7b5b4545c8-4tbgk" podUID="e698d847-3c19-47a7-986c-5552a2964f3f" Jan 23 18:33:45.914897 kubelet[2852]: E0123 18:33:45.914864 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:33:46.913048 kubelet[2852]: E0123 18:33:46.912684 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" podUID="7830dd1a-252c-46ff-ba84-ba7feca691a8" Jan 23 18:33:49.911124 kubelet[2852]: E0123 18:33:49.911015 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" podUID="4e218aa3-82e2-43da-97a5-0f48de07a97f" Jan 23 18:33:51.911681 kubelet[2852]: E0123 18:33:51.911623 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" podUID="8bfceda8-a381-44e3-ab5c-8d6cd2f190e0" Jan 23 18:33:53.914308 kubelet[2852]: E0123 18:33:53.914228 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7b5b4545c8-4tbgk" podUID="e698d847-3c19-47a7-986c-5552a2964f3f" Jan 23 18:33:56.691363 systemd[1]: cri-containerd-71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3.scope: Deactivated successfully. Jan 23 18:33:56.692138 systemd[1]: cri-containerd-71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3.scope: Consumed 17.421s CPU time, 118M memory peak. 
Jan 23 18:33:56.694846 containerd[1684]: time="2026-01-23T18:33:56.694798762Z" level=info msg="received container exit event container_id:\"71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3\" id:\"71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3\" pid:3172 exit_status:1 exited_at:{seconds:1769193236 nanos:694147564}" Jan 23 18:33:56.696000 audit: BPF prog-id=146 op=UNLOAD Jan 23 18:33:56.699012 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 18:33:56.699138 kernel: audit: type=1334 audit(1769193236.696:880): prog-id=146 op=UNLOAD Jan 23 18:33:56.696000 audit: BPF prog-id=150 op=UNLOAD Jan 23 18:33:56.711039 kernel: audit: type=1334 audit(1769193236.696:881): prog-id=150 op=UNLOAD Jan 23 18:33:56.744626 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3-rootfs.mount: Deactivated successfully. Jan 23 18:33:56.912670 kubelet[2852]: E0123 18:33:56.912599 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:33:57.138145 kubelet[2852]: E0123 18:33:57.138053 2852 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:57480->10.0.0.2:2379: read: connection timed out" Jan 23 18:33:57.391584 kubelet[2852]: I0123 18:33:57.391298 2852 status_manager.go:890] "Failed to get status for pod" podUID="7830dd1a-252c-46ff-ba84-ba7feca691a8" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:57378->10.0.0.2:2379: read: connection timed out" Jan 23 18:33:57.392019 kubelet[2852]: E0123 18:33:57.391279 2852 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:57240->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{calico-apiserver-548cd799bc-4tqh9.188d6fc4e99d10d0 calico-apiserver 1566 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-apiserver,Name:calico-apiserver-548cd799bc-4tqh9,UID:7830dd1a-252c-46ff-ba84-ba7feca691a8,APIVersion:v1,ResourceVersion:833,FieldPath:spec.containers{calico-apiserver},},Reason:BackOff,Message:Back-off pulling image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4547-1-0-c-e2d32aff86,},FirstTimestamp:2026-01-23 18:32:00 +0000 UTC,LastTimestamp:2026-01-23 18:33:46.912515067 +0000 UTC m=+157.105549037,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-1-0-c-e2d32aff86,}" Jan 23 18:33:57.460728 kubelet[2852]: I0123 18:33:57.460645 2852 scope.go:117] "RemoveContainer" containerID="71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3" Jan 23 18:33:57.463683 containerd[1684]: time="2026-01-23T18:33:57.463599271Z" level=info msg="CreateContainer within sandbox \"d7f4ac683b9ea5d2a2db7684c00b06a7eece23cac453fa1ac79e19c2b321cad5\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 23 18:33:57.476659 containerd[1684]: time="2026-01-23T18:33:57.476604784Z" level=info msg="Container 80e865c0d20f6f57f547c8dec59e0125e34a801f3e04c3d9fef9796e7acabafc: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:33:57.492236 containerd[1684]: time="2026-01-23T18:33:57.492169921Z" level=info msg="CreateContainer within sandbox \"d7f4ac683b9ea5d2a2db7684c00b06a7eece23cac453fa1ac79e19c2b321cad5\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"80e865c0d20f6f57f547c8dec59e0125e34a801f3e04c3d9fef9796e7acabafc\"" Jan 23 18:33:57.493731 containerd[1684]: time="2026-01-23T18:33:57.493466697Z" level=info msg="StartContainer for \"80e865c0d20f6f57f547c8dec59e0125e34a801f3e04c3d9fef9796e7acabafc\"" Jan 23 18:33:57.494656 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount837790338.mount: Deactivated successfully. Jan 23 18:33:57.496660 containerd[1684]: time="2026-01-23T18:33:57.496397948Z" level=info msg="connecting to shim 80e865c0d20f6f57f547c8dec59e0125e34a801f3e04c3d9fef9796e7acabafc" address="unix:///run/containerd/s/ef040c12e9c3e533f13edac7ae2a78f3c6bc1a11d0df1b955880e098af1405bd" protocol=ttrpc version=3 Jan 23 18:33:57.534257 systemd[1]: Started cri-containerd-80e865c0d20f6f57f547c8dec59e0125e34a801f3e04c3d9fef9796e7acabafc.scope - libcontainer container 80e865c0d20f6f57f547c8dec59e0125e34a801f3e04c3d9fef9796e7acabafc. 
Jan 23 18:33:57.560000 audit: BPF prog-id=256 op=LOAD Jan 23 18:33:57.567917 kernel: audit: type=1334 audit(1769193237.560:882): prog-id=256 op=LOAD Jan 23 18:33:57.568082 kernel: audit: type=1334 audit(1769193237.564:883): prog-id=257 op=LOAD Jan 23 18:33:57.564000 audit: BPF prog-id=257 op=LOAD Jan 23 18:33:57.564000 audit[5308]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2973 pid=5308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:57.574154 kernel: audit: type=1300 audit(1769193237.564:883): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2973 pid=5308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:57.587393 kernel: audit: type=1327 audit(1769193237.564:883): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830653836356330643230663666353766353437633864656335396530 Jan 23 18:33:57.564000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830653836356330643230663666353766353437633864656335396530 Jan 23 18:33:57.604412 kernel: audit: type=1334 audit(1769193237.564:884): prog-id=257 op=UNLOAD Jan 23 18:33:57.564000 audit: BPF prog-id=257 op=UNLOAD Jan 23 18:33:57.564000 audit[5308]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2973 pid=5308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:57.630037 kernel: audit: type=1300 audit(1769193237.564:884): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2973 pid=5308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:57.630142 kernel: audit: type=1327 audit(1769193237.564:884): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830653836356330643230663666353766353437633864656335396530 Jan 23 18:33:57.564000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830653836356330643230663666353766353437633864656335396530 Jan 23 18:33:57.564000 audit: BPF prog-id=258 op=LOAD Jan 23 18:33:57.633136 kernel: audit: type=1334 audit(1769193237.564:885): prog-id=258 op=LOAD Jan 23 18:33:57.564000 audit[5308]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2973 pid=5308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 18:33:57.564000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830653836356330643230663666353766353437633864656335396530 Jan 23 18:33:57.564000 audit: BPF prog-id=259 op=LOAD Jan 23 18:33:57.564000 audit[5308]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2973 pid=5308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:57.564000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830653836356330643230663666353766353437633864656335396530 Jan 23 18:33:57.564000 audit: BPF prog-id=259 op=UNLOAD Jan 23 18:33:57.564000 audit[5308]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2973 pid=5308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:57.564000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830653836356330643230663666353766353437633864656335396530 Jan 23 18:33:57.564000 audit: BPF prog-id=258 op=UNLOAD Jan 23 18:33:57.564000 audit[5308]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2973 pid=5308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:57.564000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830653836356330643230663666353766353437633864656335396530 Jan 23 18:33:57.564000 audit: BPF prog-id=260 op=LOAD Jan 23 18:33:57.564000 audit[5308]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2973 pid=5308 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:57.564000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3830653836356330643230663666353766353437633864656335396530 Jan 23 18:33:57.643311 containerd[1684]: time="2026-01-23T18:33:57.643189799Z" level=info msg="StartContainer for \"80e865c0d20f6f57f547c8dec59e0125e34a801f3e04c3d9fef9796e7acabafc\" returns successfully" Jan 23 18:33:57.913565 kubelet[2852]: E0123 18:33:57.913390 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-dw7bl" podUID="8cfde420-1102-4e10-b36c-f5766b852cc7" Jan 23 18:33:58.147331 systemd[1]: cri-containerd-34552e2cdddcb2f220fe2fe9c8b5d075d6a7dec203202f9625dd41b222505ac6.scope: Deactivated successfully. Jan 23 18:33:58.148617 systemd[1]: cri-containerd-34552e2cdddcb2f220fe2fe9c8b5d075d6a7dec203202f9625dd41b222505ac6.scope: Consumed 3.546s CPU time, 62.7M memory peak, 128K read from disk. Jan 23 18:33:58.150000 audit: BPF prog-id=261 op=LOAD Jan 23 18:33:58.150000 audit: BPF prog-id=88 op=UNLOAD Jan 23 18:33:58.151000 audit: BPF prog-id=108 op=UNLOAD Jan 23 18:33:58.151000 audit: BPF prog-id=112 op=UNLOAD Jan 23 18:33:58.154761 containerd[1684]: time="2026-01-23T18:33:58.154672118Z" level=info msg="received container exit event container_id:\"34552e2cdddcb2f220fe2fe9c8b5d075d6a7dec203202f9625dd41b222505ac6\" id:\"34552e2cdddcb2f220fe2fe9c8b5d075d6a7dec203202f9625dd41b222505ac6\" pid:2709 exit_status:1 exited_at:{seconds:1769193238 nanos:153662131}" Jan 23 18:33:58.212125 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-34552e2cdddcb2f220fe2fe9c8b5d075d6a7dec203202f9625dd41b222505ac6-rootfs.mount: Deactivated successfully. Jan 23 18:33:58.466755 kubelet[2852]: I0123 18:33:58.466582 2852 scope.go:117] "RemoveContainer" containerID="34552e2cdddcb2f220fe2fe9c8b5d075d6a7dec203202f9625dd41b222505ac6" Jan 23 18:33:58.470869 containerd[1684]: time="2026-01-23T18:33:58.470803328Z" level=info msg="CreateContainer within sandbox \"122948b6e8b4fe1f4fbad1128bb1e7b50757b209e120520b5b23c8b3b3d015dc\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 23 18:33:58.484021 containerd[1684]: time="2026-01-23T18:33:58.483665332Z" level=info msg="Container 3157d9b0edcba9c1cdc5cc36c3ebfa01255d71f4a2d2efee27dcd3df7e6c6f0c: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:33:58.499965 containerd[1684]: time="2026-01-23T18:33:58.499908947Z" level=info msg="CreateContainer within sandbox \"122948b6e8b4fe1f4fbad1128bb1e7b50757b209e120520b5b23c8b3b3d015dc\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"3157d9b0edcba9c1cdc5cc36c3ebfa01255d71f4a2d2efee27dcd3df7e6c6f0c\"" Jan 23 18:33:58.501375 containerd[1684]: time="2026-01-23T18:33:58.500961795Z" level=info msg="StartContainer for \"3157d9b0edcba9c1cdc5cc36c3ebfa01255d71f4a2d2efee27dcd3df7e6c6f0c\"" Jan 23 18:33:58.503780 containerd[1684]: time="2026-01-23T18:33:58.503702637Z" level=info msg="connecting to shim 3157d9b0edcba9c1cdc5cc36c3ebfa01255d71f4a2d2efee27dcd3df7e6c6f0c" address="unix:///run/containerd/s/063034fe046166f4557eac37cab47473d2a5b3ecaa810e6d23d522aa7e27c73c" protocol=ttrpc version=3 Jan 23 18:33:58.540320 systemd[1]: Started cri-containerd-3157d9b0edcba9c1cdc5cc36c3ebfa01255d71f4a2d2efee27dcd3df7e6c6f0c.scope - libcontainer container 3157d9b0edcba9c1cdc5cc36c3ebfa01255d71f4a2d2efee27dcd3df7e6c6f0c. 
Jan 23 18:33:58.566000 audit: BPF prog-id=262 op=LOAD Jan 23 18:33:58.567000 audit: BPF prog-id=263 op=LOAD Jan 23 18:33:58.567000 audit[5354]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2558 pid=5354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:58.567000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331353764396230656463626139633163646335636333366333656266 Jan 23 18:33:58.567000 audit: BPF prog-id=263 op=UNLOAD Jan 23 18:33:58.567000 audit[5354]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2558 pid=5354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:58.567000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331353764396230656463626139633163646335636333366333656266 Jan 23 18:33:58.567000 audit: BPF prog-id=264 op=LOAD Jan 23 18:33:58.567000 audit[5354]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2558 pid=5354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:58.567000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331353764396230656463626139633163646335636333366333656266 Jan 23 18:33:58.568000 audit: BPF prog-id=265 op=LOAD Jan 23 18:33:58.568000 audit[5354]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2558 pid=5354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:58.568000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331353764396230656463626139633163646335636333366333656266 Jan 23 18:33:58.568000 audit: BPF prog-id=265 op=UNLOAD Jan 23 18:33:58.568000 audit[5354]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2558 pid=5354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:58.568000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331353764396230656463626139633163646335636333366333656266 Jan 23 18:33:58.568000 audit: BPF prog-id=264 op=UNLOAD Jan 23 18:33:58.568000 audit[5354]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2558 pid=5354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:58.568000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331353764396230656463626139633163646335636333366333656266 Jan 23 18:33:58.568000 audit: BPF prog-id=266 op=LOAD Jan 23 18:33:58.568000 audit[5354]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2558 pid=5354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:33:58.568000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331353764396230656463626139633163646335636333366333656266 Jan 23 18:33:58.635282 containerd[1684]: time="2026-01-23T18:33:58.635192905Z" level=info msg="StartContainer for \"3157d9b0edcba9c1cdc5cc36c3ebfa01255d71f4a2d2efee27dcd3df7e6c6f0c\" returns successfully" Jan 23 18:33:59.912095 kubelet[2852]: E0123 18:33:59.912013 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" podUID="7830dd1a-252c-46ff-ba84-ba7feca691a8" Jan 23 18:34:00.911612 kubelet[2852]: E0123 18:34:00.911533 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" podUID="4e218aa3-82e2-43da-97a5-0f48de07a97f" Jan 23 18:34:02.799022 systemd[1]: cri-containerd-97401b6b5e5fc652a29045400076507002eb8dd08f3844eda4ad3eefd747b565.scope: Deactivated successfully. Jan 23 18:34:02.800274 systemd[1]: cri-containerd-97401b6b5e5fc652a29045400076507002eb8dd08f3844eda4ad3eefd747b565.scope: Consumed 2.213s CPU time, 20.1M memory peak, 68K read from disk. 
Jan 23 18:34:02.802847 kernel: kauditd_printk_skb: 40 callbacks suppressed Jan 23 18:34:02.803323 kernel: audit: type=1334 audit(1769193242.799:902): prog-id=267 op=LOAD Jan 23 18:34:02.799000 audit: BPF prog-id=267 op=LOAD Jan 23 18:34:02.806126 containerd[1684]: time="2026-01-23T18:34:02.806077555Z" level=info msg="received container exit event container_id:\"97401b6b5e5fc652a29045400076507002eb8dd08f3844eda4ad3eefd747b565\" id:\"97401b6b5e5fc652a29045400076507002eb8dd08f3844eda4ad3eefd747b565\" pid:2691 exit_status:1 exited_at:{seconds:1769193242 nanos:801134628}" Jan 23 18:34:02.799000 audit: BPF prog-id=93 op=UNLOAD Jan 23 18:34:02.804000 audit: BPF prog-id=103 op=UNLOAD Jan 23 18:34:02.809608 kernel: audit: type=1334 audit(1769193242.799:903): prog-id=93 op=UNLOAD Jan 23 18:34:02.809669 kernel: audit: type=1334 audit(1769193242.804:904): prog-id=103 op=UNLOAD Jan 23 18:34:02.812075 kernel: audit: type=1334 audit(1769193242.804:905): prog-id=107 op=UNLOAD Jan 23 18:34:02.804000 audit: BPF prog-id=107 op=UNLOAD Jan 23 18:34:02.846816 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-97401b6b5e5fc652a29045400076507002eb8dd08f3844eda4ad3eefd747b565-rootfs.mount: Deactivated successfully. Jan 23 18:34:02.910874 kubelet[2852]: E0123 18:34:02.910844 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" podUID="8bfceda8-a381-44e3-ab5c-8d6cd2f190e0" Jan 23 18:34:03.497498 kubelet[2852]: I0123 18:34:03.497394 2852 scope.go:117] "RemoveContainer" containerID="97401b6b5e5fc652a29045400076507002eb8dd08f3844eda4ad3eefd747b565" Jan 23 18:34:03.502148 containerd[1684]: time="2026-01-23T18:34:03.500966215Z" level=info msg="CreateContainer within sandbox \"bb7960dc02fea21ad0fcf26cf8e1de2be2a94cc06c246024c736886d9737c19a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 23 18:34:03.520273 containerd[1684]: time="2026-01-23T18:34:03.520205494Z" level=info msg="Container 260c480a07b9c0c9826cbc32cc4a13f99b92ada85f083ee95ba75ad8cb11d81e: CDI devices from CRI Config.CDIDevices: []" Jan 23 18:34:03.539362 containerd[1684]: time="2026-01-23T18:34:03.539316994Z" level=info msg="CreateContainer within sandbox \"bb7960dc02fea21ad0fcf26cf8e1de2be2a94cc06c246024c736886d9737c19a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"260c480a07b9c0c9826cbc32cc4a13f99b92ada85f083ee95ba75ad8cb11d81e\"" Jan 23 18:34:03.540153 containerd[1684]: time="2026-01-23T18:34:03.540125032Z" level=info msg="StartContainer for \"260c480a07b9c0c9826cbc32cc4a13f99b92ada85f083ee95ba75ad8cb11d81e\"" Jan 23 18:34:03.542419 containerd[1684]: time="2026-01-23T18:34:03.542346286Z" level=info msg="connecting to shim 260c480a07b9c0c9826cbc32cc4a13f99b92ada85f083ee95ba75ad8cb11d81e" address="unix:///run/containerd/s/37559ca7bd6eb011ab10a6f4523370c88f7cd22c0df7658fef01a07f7971f05b" protocol=ttrpc version=3 Jan 23 18:34:03.590133 systemd[1]: Started cri-containerd-260c480a07b9c0c9826cbc32cc4a13f99b92ada85f083ee95ba75ad8cb11d81e.scope - libcontainer container 260c480a07b9c0c9826cbc32cc4a13f99b92ada85f083ee95ba75ad8cb11d81e. 
Jan 23 18:34:03.608000 audit: BPF prog-id=268 op=LOAD Jan 23 18:34:03.613065 kernel: audit: type=1334 audit(1769193243.608:906): prog-id=268 op=LOAD Jan 23 18:34:03.612000 audit: BPF prog-id=269 op=LOAD Jan 23 18:34:03.612000 audit[5401]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2593 pid=5401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:03.619944 kernel: audit: type=1334 audit(1769193243.612:907): prog-id=269 op=LOAD Jan 23 18:34:03.620051 kernel: audit: type=1300 audit(1769193243.612:907): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2593 pid=5401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:03.638405 kernel: audit: type=1327 audit(1769193243.612:907): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236306334383061303762396330633938323663626333326363346131 Jan 23 18:34:03.612000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236306334383061303762396330633938323663626333326363346131 Jan 23 18:34:03.641819 kernel: audit: type=1334 audit(1769193243.612:908): prog-id=269 op=UNLOAD Jan 23 18:34:03.612000 audit: BPF prog-id=269 op=UNLOAD Jan 23 18:34:03.652664 kernel: audit: type=1300 audit(1769193243.612:908): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=5401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:03.612000 audit[5401]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=5401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:03.612000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236306334383061303762396330633938323663626333326363346131 Jan 23 18:34:03.612000 audit: BPF prog-id=270 op=LOAD Jan 23 18:34:03.612000 audit[5401]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2593 pid=5401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:03.612000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236306334383061303762396330633938323663626333326363346131 Jan 23 18:34:03.612000 audit: BPF prog-id=271 op=LOAD Jan 23 18:34:03.612000 audit[5401]: SYSCALL arch=c000003e syscall=321 
success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2593 pid=5401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:03.612000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236306334383061303762396330633938323663626333326363346131 Jan 23 18:34:03.612000 audit: BPF prog-id=271 op=UNLOAD Jan 23 18:34:03.612000 audit[5401]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=5401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:03.612000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236306334383061303762396330633938323663626333326363346131 Jan 23 18:34:03.612000 audit: BPF prog-id=270 op=UNLOAD Jan 23 18:34:03.612000 audit[5401]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2593 pid=5401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:03.612000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236306334383061303762396330633938323663626333326363346131 Jan 23 18:34:03.612000 audit: BPF prog-id=272 op=LOAD Jan 23 18:34:03.612000 audit[5401]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2593 pid=5401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 18:34:03.612000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3236306334383061303762396330633938323663626333326363346131 Jan 23 18:34:03.698023 containerd[1684]: time="2026-01-23T18:34:03.697058433Z" level=info msg="StartContainer for \"260c480a07b9c0c9826cbc32cc4a13f99b92ada85f083ee95ba75ad8cb11d81e\" returns successfully" Jan 23 18:34:07.139904 kubelet[2852]: E0123 18:34:07.139330 2852 controller.go:195] "Failed to update lease" err="Put \"https://46.62.169.9:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-1-0-c-e2d32aff86?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 23 18:34:07.911922 kubelet[2852]: E0123 18:34:07.911833 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-6z82s" podUID="f61eaf28-593a-461f-8945-a34eecb93534" Jan 23 18:34:08.860882 systemd[1]: cri-containerd-80e865c0d20f6f57f547c8dec59e0125e34a801f3e04c3d9fef9796e7acabafc.scope: Deactivated successfully. Jan 23 18:34:08.862447 containerd[1684]: time="2026-01-23T18:34:08.862390049Z" level=info msg="received container exit event container_id:\"80e865c0d20f6f57f547c8dec59e0125e34a801f3e04c3d9fef9796e7acabafc\" id:\"80e865c0d20f6f57f547c8dec59e0125e34a801f3e04c3d9fef9796e7acabafc\" pid:5320 exit_status:1 exited_at:{seconds:1769193248 nanos:861841780}" Jan 23 18:34:08.864000 audit: BPF prog-id=256 op=UNLOAD Jan 23 18:34:08.866589 kernel: kauditd_printk_skb: 16 callbacks suppressed Jan 23 18:34:08.866771 kernel: audit: type=1334 audit(1769193248.864:914): prog-id=256 op=UNLOAD Jan 23 18:34:08.864000 audit: BPF prog-id=260 op=UNLOAD Jan 23 18:34:08.873684 kernel: audit: type=1334 audit(1769193248.864:915): prog-id=260 op=UNLOAD Jan 23 18:34:08.914299 kubelet[2852]: E0123 18:34:08.914161 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7b5b4545c8-4tbgk" podUID="e698d847-3c19-47a7-986c-5552a2964f3f" Jan 23 18:34:08.923458 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-80e865c0d20f6f57f547c8dec59e0125e34a801f3e04c3d9fef9796e7acabafc-rootfs.mount: Deactivated successfully. 
Jan 23 18:34:09.519667 kubelet[2852]: I0123 18:34:09.519628 2852 scope.go:117] "RemoveContainer" containerID="71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3" Jan 23 18:34:09.520618 kubelet[2852]: I0123 18:34:09.520351 2852 scope.go:117] "RemoveContainer" containerID="80e865c0d20f6f57f547c8dec59e0125e34a801f3e04c3d9fef9796e7acabafc" Jan 23 18:34:09.520618 kubelet[2852]: E0123 18:34:09.520551 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-pmgq6_tigera-operator(a35443c2-716a-4cc2-b350-77c80906f366)\"" pod="tigera-operator/tigera-operator-7dcd859c48-pmgq6" podUID="a35443c2-716a-4cc2-b350-77c80906f366" Jan 23 18:34:09.522501 containerd[1684]: time="2026-01-23T18:34:09.522460027Z" level=info msg="RemoveContainer for \"71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3\"" Jan 23 18:34:09.530077 containerd[1684]: time="2026-01-23T18:34:09.530036888Z" level=info msg="RemoveContainer for \"71a12e8d0e3ec940852e4a5dbdc7beb96efd9d3a886911e69047eadb6a53e7d3\" returns successfully" Jan 23 18:34:09.911670 kubelet[2852]: E0123 18:34:09.911570 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-dw7bl" podUID="8cfde420-1102-4e10-b36c-f5766b852cc7" Jan 23 18:34:11.912009 kubelet[2852]: E0123 18:34:11.911780 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-4tqh9" podUID="7830dd1a-252c-46ff-ba84-ba7feca691a8" Jan 23 18:34:14.911603 kubelet[2852]: E0123 18:34:14.911517 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-69d9c854f8-4vpsg" podUID="4e218aa3-82e2-43da-97a5-0f48de07a97f" Jan 23 18:34:14.911603 kubelet[2852]: E0123 18:34:14.911586 2852 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-548cd799bc-rm88n" podUID="8bfceda8-a381-44e3-ab5c-8d6cd2f190e0" Jan 23 
18:34:17.141179 kubelet[2852]: E0123 18:34:17.140903 2852 controller.go:195] "Failed to update lease" err="Put \"https://46.62.169.9:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-1-0-c-e2d32aff86?timeout=10s\": context deadline exceeded"